The Cloud Fumbles

For the last several years, "ransomware" has been one of the bigger problems on the internet. Half computer virus, half extortion scheme, it's the frustrating encryption or theft of personal data that can only be recovered by paying the attacker to get things restored. It's incredibly annoying, but good antivirus and security practices can generally protect people. Personally, I've never dealt with it, but it seems to pop up in the news every few months.
However, it now seems hackers have taken ransomware to a whole new level. Earlier today, the SVN hosting service "Code Spaces" was shut down when an attacker who had gained control of their Amazon EC2 panel demanded money, and when they tried to lock him out (by doing exactly what I would have done: changing passwords and removing access) he irreversibly erased their customer data and repos! Much the same thing nearly happened to Feedly, though they were more successful in averting disaster. Evernote got hit with a DDoS as well, but (as far as they are reporting) there was no extortion attempt.
Whether or not these incidents are related, it's a scary time. For the last several years, lots of people have migrated data to the cloud: documents and communications exist in Google services, backups in Carbonite and CrashPlan, personal memories in Flickr and Instagram, bits of data in services like GitHub and Evernote. When the cloud is suddenly no safer than your personal computer, maybe it's time to start moving toward "personal clouds"?

Neat Bash Tricks

I’ve been writing lots of bash scripts lately, so I thought it would be a good idea to document a few of the little tricks I’ve been using.

First off, to make a script that has to be executed in bash:

#!/bin/bash

if [ ! "$BASH_VERSION" ] ; then
  echo "Please do not use sh to run this script ($0), just execute it directly" 1>&2
  exit 1
fi

This was needed for a script that makes use of bash-specific syntax (a C-style for loop and some basic arithmetic), and I had a user reporting errors when they ran it as sh script.sh. Invoked that way, sh treats the opening #!/bin/bash line as a comment and interprets the script itself, failing when it hits those lines.
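To illustrate the failure, here's the kind of bash-only construct involved; this generic C-style loop (my example, not the original script's code) runs fine under bash but is a syntax error in a POSIX sh like dash:

```shell
# Runs under bash; plain `sh`/dash rejects the (( )) arithmetic loop syntax:
for (( i = 0; i < 3; i++ )); do
  echo "iteration $i"
done
```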

Then, to make a script that has to be sourced into the current environment (to set environment variables like PATH and LD_LIBRARY_PATH for the user):

#!/bin/bash

if [[ "$(basename "$0")" == "script.sh" ]]; then
  echo "ERROR: Don't run $0, source it" >&2
  exit 1
fi

Replace script.sh with the name of the actual file.
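A more robust variant (my suggestion, not from the original post) avoids hard-coding the filename by comparing $0 against BASH_SOURCE, which only match when the file is executed rather than sourced. This demo writes the guard to a temp file and tries it both ways:

```shell
# Write a guard script to a temp file so we can exercise both invocations:
cat > /tmp/guard_demo.sh <<'EOF'
#!/bin/bash
if [[ "${BASH_SOURCE[0]}" == "${0}" ]]; then
  echo "ERROR: Don't run ${BASH_SOURCE[0]}, source it" >&2
  exit 1
fi
echo "sourced OK"
EOF

bash /tmp/guard_demo.sh 2>/dev/null; direct_rc=$?    # guard trips: exit 1
bash -c 'source /tmp/guard_demo.sh'; sourced_rc=$?   # guard passes: exit 0
```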

Then, to URL-encode a parameter in bash (I used this to encode a user-provided git password into the URL), use this function:

rawurlencode() {

  local string="${1}"
  local strlen=${#string}
  local encoded=""

  for (( pos=0 ; pos<strlen ; pos++ )); do
    c=${string:$pos:1}
    case "$c" in
      [-_.~a-zA-Z0-9] ) o="${c}" ;;
      * )               printf -v o '%%%02x' "'$c"
    esac
    encoded+="${o}"
  done
  echo "${encoded}"    # You can either echo the result (EASIER)
  REPLY="${encoded}"   #+or set a return variable (FASTER)... or both... :p
}
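To see what it produces, here's the same function (repeated with its complete for-loop header so this snippet runs standalone) applied to a password full of reserved characters:

```shell
# The rawurlencode function from above, repeated so this demo is self-contained:
rawurlencode() {
  local string="${1}"
  local strlen=${#string}
  local encoded=""
  for (( pos=0 ; pos<strlen ; pos++ )); do
    c=${string:$pos:1}
    case "$c" in
      [-_.~a-zA-Z0-9] ) o="${c}" ;;
      * )               printf -v o '%%%02x' "'$c"
    esac
    encoded+="${o}"
  done
  echo "${encoded}"
}

# '@' becomes %40, space becomes %20, '/' becomes %2f:
rawurlencode 'p@ss w/rd'   # prints p%40ss%20w%2frd
```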

Then you can use it like so:

echo "Enter your GitHub username:"
read GIT_USER

echo "Enter your GitHub password:"
read -s GIT_PASSWORD

GIT_AUTH=$( rawurlencode "${GIT_USER}" ):$( rawurlencode "${GIT_PASSWORD}" )

git clone "http://${GIT_AUTH}@github.com/user/repo"

Yeah yeah, I know... SSH keys are better. Tell it to the users, not me.

Finally, using getopts. There's a lot to it, but here are the basics:

USE_SSH=0
AUTOMATED=0
BUILD_WIDTH=1

while getopts "aj:s" o; do
    case "${o}" in
        s)
            USE_SSH=1
            ;;
        a)
            AUTOMATED=1
            ;;
        j)
            BUILD_WIDTH=${OPTARG}
            ;;
        *)
            usage
            ;;
    esac
done
shift $((OPTIND-1))

Important bits:

  • The getopts string: include each character recognized as a flag, following it with a : if it requires an argument.
  • Then include a case for each flag in the big switch. Use ${OPTARG} to get the flag's argument, if it takes one.

That makes argument parsing in bash scripts simple. Also define a usage function to spew out the supported options, and you're golden.
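For completeness, here's a hypothetical usage function to pair with the getopts loop above; the flag descriptions are my guesses, and a real one would likely exit 1 after printing:

```shell
# Hypothetical help text for the -a/-s/-j flags shown earlier:
usage() {
  echo "Usage: $(basename "$0") [-a] [-s] [-j jobs]"
  echo "  -s        use SSH"
  echo "  -a        automated (non-interactive) mode"
  echo "  -j jobs   parallel build width (default: 1)"
}

usage
```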

Sublime Text, Xeno, and Build Systems

So I'm still loving the use of xeno.io for editing files on remote machines. The only problem I've had so far, and it's not really a problem so much as an annoyance, is that I can't use Sublime Text's built-in build features. I'd not really noticed how much I missed them until I wound up working on a machine locally and could hammer F7 to build, then F4 to jump to the first error. Install the Build Next plugin and it really becomes a nice way to do C/C++ development (still missing the fabulous SublimeClang plugin, though).

So, I spent a bit of time and built a custom build system that works with it. It uses a shell script (below) to find the remote host and directory where the code really resides, then sshes over to run the build. It then takes any filenames in the output (from error messages or warnings) and converts them back to their local path equivalents, so that F4 still takes you to the source of errors.

It works with either make or ninja (another favorite tool of mine), and works amazingly well. I also got it to force a xeno sync prior to each build, eliminating one of my major annoyances.

To get this to work, first you'll need the following shell script, which I call xeno-build.sh:

#!/bin/bash
if dirname "$2" | grep -q .xeno
then
    echo "This looks like a Xeno project."
    SOURCEDIR=$(dirname "$2")
    SYNCID=$(git config xeno.syncprocessid)
    echo "Syncing..."
    /usr/local/bin/xeno-sync "$SYNCID"
    # Pull the remote hostname out of the push URL for origin
    HOST=$(git remote -v | grep origin | grep push | cut -d ' ' -f 1 | cut -f 2 | cut -d '/' -f 3)
    RDIR=$(git config xeno.remotepath)
    BUILDDIR=$(dirname "$RDIR")/build
    echo "Host = $HOST"
    echo "BUILDDIR = $BUILDDIR"
    echo ssh "$HOST" "cd $BUILDDIR && $1" \| sed "s=$RDIR=$SOURCEDIR=g"
    ssh "$HOST" "cd $BUILDDIR && env TERM=screen-256color $1" | /usr/local/opt/gnu-sed/libexec/gnubin/sed -u "s=$RDIR=$SOURCEDIR=g"
fi

You might need to edit the BUILDDIR line to match how you do builds. I use CMake and do out-of-source builds, so I always have a root/src and root/build directory.
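The heavy lifting in the script is a single sed substitution. This standalone sketch (the paths are made up for illustration) shows how a remote compiler diagnostic gets mapped back to the local xeno checkout so that Sublime's file_regex can open the right file:

```shell
# Made-up paths, purely for illustration:
RDIR="/home/rhand/project/src"        # where the source lives on the build host
SOURCEDIR="/Users/rhand/.xeno/proj"   # the local xeno mirror of that tree

# A typical compiler diagnostic coming back over ssh:
line="$RDIR/main.cpp:12:5: error: expected ';'"

# Rewrite the remote prefix to the local one ('=' as the sed delimiter
# avoids clashing with the slashes in the paths):
rewritten=$(echo "$line" | sed "s=$RDIR=$SOURCEDIR=g")
echo "$rewritten"   # prints /Users/rhand/.xeno/proj/main.cpp:12:5: error: expected ';'
```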

Then, in your Sublime Text 3 Packages folder, you'll want a build systems file like this:

{
    "cmd":
        [
            "/Users/rhand/bin/xeno-build.sh",
            "make -j9",
            "$project_path/here"
        ],
    "file_regex": "^(..[^:]*):([0-9]+):?([0-9]+)?:? (.*)$",
    "variants":
    [
        {
            "name": "Make Tests",
            "cmd":
            [
                "/Users/rhand/bin/xeno-build.sh",
                "make test",
                "$project_path/here"
            ]
        },
        {
            "name": "Make -j4",
            "cmd":
            [
                "/Users/rhand/bin/xeno-build.sh",
                "make -j4",
                "$project_path/here"
            ]
        },
        {
            "name": "Make Single",
            "cmd":
            [
                "/Users/rhand/bin/xeno-build.sh",
                "make",
                "$project_path/here"
            ]
        },
        {
            "name": "Clean",
            "cmd":
            [
                "/Users/rhand/bin/xeno-build.sh",
                "make clean",
                "$project_path/here"
            ]
        }
    ]
}

Simply replace "make" with "ninja" here and you can do that too. The basic usage of the script is xeno-build.sh command sourcefile. The command can be anything really (make, ninja, rm, or whatever build system you like), and the sourcefile is any file in the source directory (that really doesn't even have to exist).

So, hopefully someone out there will find this useful. I've made it a standard part of my builds.

A Better Way to fix OSX Calendar & Google Hangouts

So, ever since my post a while back on OSX Calendar and Google Hangouts, I've been annoyed by the constant need to drag appointments onto the little Automator robot. I grew to loathe that little robot, sucking precious seconds of my time prior to each meeting and taking almost 2 minutes every run to launch Automator, process my calendar, and update it. The end result was that I went back to Google Calendar.

So, I've spent some time over the last few weekends and came up with a much better option. The result is available on GitHub as CalendarHangout. Set it up and run the script, and every calendar event in Calendar suddenly has a nice little URL link at the bottom that you can click to launch the Hangout.

Simply download (or clone) that, and edit the script with the name of your calendar in Google (typically your email address). Then run it once by hand and go through the steps to authenticate it (the GData3 API requires OAuth authentication). After that, you can toss it into a cronjob or just run it on occasion to fix everything two weeks out!

So what's it doing? It gets a list of all appointments on the two-week horizon from Google, then uses AppleScript to update each one with the Hangout link. Due to the obnoxious nature of both companies involved (Google and Apple), this is far more frustrating than it needs to be:

  • Google doesn’t support the RFC standard “URL” field, which would make this trivial.
  • Apple doesn’t offer any robust way to get occurrences of repeating events.
  • Apple Calendar tends to take a long time and lots of CPU cycles to make these changes, and AppleScript seems to be the only way to talk to it.

So, I use Google Calendar's API to get events happening in the next two weeks, where AppleScript would fail because of its appalling recurrence support. Then I use AppleScript to update the events on your local calendar (matching them via UID), since Google doesn't support the decades-old "URL" field.

And it all works! It's slow, taking approximately 10 minutes to run on my machine (and maxing out all CPUs as it does). I know, that's a stupid computational load, but it seems to be due to a known bug in OSX Calendar (look for talk of tccd and AppleScript calendar slowness).

So try it out, and if you find any glitches or make any improvements, post in the comments! (Or even better, submit a pull request via GitHub!)

FishShell: CMake & finding source directories


CMake is a great tool for makefile generation, far better than the old arcane configure scripts, but its great out-of-source build support can lead to a common annoyance: constantly jumping back and forth between build and source directories, or having multiple build directories for a single source checkout. In my case, I frequently find myself forgetting exactly where the correct source tree is when I'm working in a build tree.

So, here's a little fish function called prompt_src that you can add to your prompt so it always shows you the source directory for the build tree you're in, along with the current git version you're working from. The image above shows my OpenCV/build directory (the response is in yellow, indicating it's not a Git source tree), and then my main application directory showing in red because it's been modified, though it's on the develop branch.

# src
function prompt_src --description "Find details on this build's SOURCE dir"
    set -l cdir (pwd)
    while [ $cdir != "/" ]
        if [ -e $cdir/CMakeCache.txt ]
            set -l SourceDir ( cat $cdir/CMakeCache.txt | grep SOURCE_DIR | head -n 1 | cut -d '=' -f 2)
            if [ -d $SourceDir/.git ]
                set -l gitinfo (git --git-dir=$SourceDir/.git --work-tree=$SourceDir status -sb --untracked-files=no)
                set -l branch ( echo $gitinfo[1] | cut -b 4- )
                set -l branch_color (set_color red)\[M\]
                if test (count $gitinfo) -eq 1
                    set branch_color (set_color green)
                end

                echo \* Builds (set_color green)$SourceDir $branch_color \($branch\) (set_color normal)
                return
            else
                echo \* Builds (set_color yellow)$SourceDir (set_color normal)
                return
            end
        end
        set cdir (dirname $cdir)
    end
end

FishShell: Create & Expand compressed archives

If you spend much time in a terminal, be it on Mac or Linux, one thing you wind up doing often is creating and decompressing tarballs. Be they tgz, tbz, or just plain tar files, they're the archive format of choice for folks working in *nix environments due to their universal support. Normally, the only way to get any real status information is to turn on "verbose" mode, which outputs each filename as it goes. That's not terribly useful for large archives.

If you install the 'pv' tool, you can partner it with some command-line-fu to get nice progress bars. But why deal with that every time, when you can write a fish function to do it for you!

Here's a script (expand.fish) that decompresses a variety of formats:

#expand
function expand -d 'Decompress a file' -a filename
    set -l extension ( echo $filename | awk -F . '{print $NF}')
    switch $extension
        case tar
            echo "Un-tar-ing $filename..."
            pv $filename | tar xf -
        case tgz
            echo "Un-tar/gz-ing $filename..."
            pv $filename | tar zxf -
        case tbz
            echo "Un-tar/bz-ing $filename..."
            pv $filename | tar jxf -
        case gz
            echo "Un-gz-ing $filename..."
            set -l out (basename $filename .gz)
            pv $filename | gunzip > $out
        case bz
            echo "Un-bz-ing $filename..."
            set -l out (basename $filename .bz)
            pv $filename | bunzip2 > $out
        case zip
            echo "Un-zipping $filename..."
            unzip $filename
        case '*'
            echo I don\'t know what to do with $extension files.
    end
end

And here's a matching script for creating tarballs:

#tarball
function tarball -d "Create a tarball of collected files" -a filename
    echo "Creating a tarball of $filename"
    if [ -e $filename ]
        echo "Sorry, $filename already exists."
        return
    end
    set -l args $argv[2..-1]
    set -l size (du -ck $args | tail -n 1 | cut -f 1)
    set -l extension ( echo $filename | awk -F . '{print $NF}')

    switch $extension
        case tgz
            tar cf - $args | pv -p -s {$size}k | gzip -c > $filename
        case tbz
            tar cf - $args | pv -p -s {$size}k | bzip2 -c > $filename
        case '*'
            echo "I don't know how to make a '$extension' file."
            return
    end
    set -l shrunk (du -sk $filename | cut -f 1)
    set -l ratio ( math "$shrunk * 100.0 / $size")
    echo Reduced {$size}k to {$shrunk}k \({$ratio}%\)
end

Enjoy!

Xeno opening Sublime Projects

I really love Xeno, but one of my biggest gripes is that even if I configure it to use Sublime Text as my editor, it opens the entire directory and not the project file. For most people maybe that's not an issue, but for me it means a loss of indentation settings, spacing styles, clang options, and lots more.

So I wrote this script called “subl-project.sh”:


#!/bin/sh
echo "Looking in $1"
found=0
for s in `find "$1" -maxdepth 1 -name \*.sublime-project`; do
    echo "Opening a sublime project: $s"
    subl -p "$s"
    found=1
done

if [ $found = 0 ]; then
    echo "No project found, opening $1"
    subl "$1"
fi

And then executed this from the command line:

xeno config core.editor ~/bin/subl-project.sh

And tada! Now when I begin or resume an editing session with xeno, it defaults to opening any Sublime Project files it finds first! If none are found, then it just does the usual.

Tar/Untar on OSX/Linux with pretty progress bars

On a console, install 'pv' (available from homebrew and apt-get) and then use the following to get a nice progress bar with percentage progress and ETA when decompressing:

    pv file.tgz | tar xzf - -C target_directory

And use this when compressing for the same:

    SIZE=`du -sk folder-with-big-files | cut -f 1`
    tar cvf - folder-with-big-files | pv -p -s ${SIZE}k | bzip2 -c > big-files.tar.bz2

Works with gzip instead of bzip too!

Handy Git Configuration

Spending lots of time in git lately, I thought I'd log my git environment here, since I keep having to replicate it on various machines.

git config --global  rerere.enabled true

git config --global  alias.lg "log --color --graph --pretty=format:'%Cred%h%Creset -%C(yellow)%d%Creset %s %Cgreen(%cr) %C(bold blue)<%an>%Creset' --abbrev-commit"
git config --global  push.default current
git config --global  user.name "Randall Hand"
git config --global  user.email ""
git config --global color.ui true
git config --global core.editor vim 
git config --global core.autocrlf input

And on a mac, I have a few more:

git config --global credential.helper osxkeychain
git config --global core.editor 'subl -w'
git config --global mergetool.sublime.cmd 'subl -w $MERGED'
git config --global mergetool.sublime.trustExitCode false
git config --global merge.tool sublime

Anyone else have any neat things?

Addition Dec-09:

git config --global branch.autosetuprebase always

And for any existing branches that you want to “convert” over to always rebase, execute this in bash:

for branch in $(git for-each-ref --format='%(refname)' -- refs/heads/); do
    git config branch."${branch#refs/heads/}".rebase true
done