Tag: technology

Plotting Data in GDB

Using GDB to debug programs is great and all, but if you work with scientific data then one really critical feature is the ability to plot a long array or vector, to see how the solution is evolving at various stages.

After a bit of tinkering, I was able to take this code snippet and modify it for my needs. It had just a few problems:

  • The syntax they used for labeling things fell apart when I was using Eigen vectors: the `->` became shell redirects.

  • It didn’t like very long (100+ entry) arrays.

So I modified it like so:

# plot1d.gdb
#
# Copyright (C) 2008 Florian Lorenzen
# - Taken from https://sourceware.org/gdb/wiki/PlottingFromGDB
# Modified by Randall Hand to actually make it work.
# The original version badly handled variable names
# like esf->data[0]@1024 (turning the > into a shell redirect).
#
# Plot an expression that expands to {x1, x2, ..., xN}, i.e.
# N numbers, using gnuplot.
#
# This file is for the GNU debugger 6.x.
#
# It writes temporary files named __plot1d.dump, __plot1d.dat, and
# __plot1d.gp, so you should not have files of the same name in the
# working directory.
#
# It requires sed, awk, and gnuplot available in the $PATH.

# plot1d_opt_range <expr> <opt> <range>
#
# Plot the points of <expr>, passing <opt> as plot options and using
# <range> in the set yrange command.
define plot1d_opt_range
    shell rm -f /tmp/__plot1d.dump /tmp/__plot1d.dat /tmp/__plot1d.gp
    set logging file /tmp/__plot1d.dump
    set logging on
    set logging redirect on
    set height 0
    set print elements unlimited
    output $arg0
    set logging off
    set logging redirect off
    shell awk '{printf("%s", $0)}' < /tmp/__plot1d.dump | sed 's/^{\(.*\)}$/\1/;s/, */\n/g' > /tmp/__plot1d.dat
    shell echo 'set yrange $arg2; plot "/tmp/__plot1d.dat" $arg1 title "$arg0"; pause -1 "Press enter to continue"' > /tmp/__plot1d.gp
    shell gnuplot /tmp/__plot1d.gp
    # shell rm -f /tmp/__plot1d.dump /tmp/__plot1d.dat /tmp/__plot1d.gp
end

# plot1d <expr>
#
# Just plot the points of <expr>.
define plot1d
    plot1d_opt_range $arg0 "" "[*:*]"
end

# plot1d_opt <expr> <opt>
#
# Plot the points of <expr>, passing <opt> to the plot command after
# the datafile. So, one can pass "with lines" here.
define plot1d_opt
    plot1d_opt_range $arg0 $arg1 "[*:*]"
end
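
The heavy lifting happens in the awk/sed pair: GDB's `output` command dumps the array as `{x1, x2, ..., xN}`, possibly wrapped across lines, and the pipeline reshapes that into one value per line for gnuplot. Here's a quick sketch of that transformation in a plain shell (GNU sed, since `\n` appears in the replacement):

```shell
# Simulate GDB's output of an array, then apply the script's pipeline:
# awk joins wrapped lines, sed strips the braces and splits on commas.
printf '{1.5, 2,\n 3}' | awk '{printf("%s", $0)}' | sed 's/^{\(.*\)}$/\1/;s/, */\n/g'
# prints:
# 1.5
# 2
# 3
```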


With this file placed somewhere and sourced, you can then simply do `plot1d data[0]@128` to get a nice plot of the first 128 values of `data`.

And if you like it so much you want to keep it, simply add `source plot1d.gdb` (with a path to wherever you put it) to your `~/.gdbinit`.

Neat Bash Tricks

I’ve been writing lots of bash scripts lately, so I thought it would be a good idea to document a few of the little tricks I’ve been using.

First off, to make a script that has to be executed in bash:
if [ ! "$BASH_VERSION" ] ; then
    echo "Please do not use sh to run this script ($0), just execute it directly" 1>&2
    exit 1
fi

This was needed for a script that makes use of bash-specific syntax for `for` loops and some basic math, and a user reported errors when they ran it with `sh script.sh`. Invoked that way, `sh` ignores the `#!/bin/bash` shebang line, interprets the script itself, and fails when it hits those bash-only lines.
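
As an illustration, here's the kind of bash-only construct that breaks under `sh script.sh` (a made-up example, not the original script): dash rejects the C-style for loop below with a syntax error, while bash happily sums 1 through 4.

```shell
# A bashism: C-style for loops are not POSIX sh syntax.
total=0
for (( i=1; i<=4; i++ )); do
    total=$(( total + i ))
done
echo "total=$total"   # prints total=10
```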

Then, to make a script that has to be sourced into the current environment (to set environment variables like `PATH` and `LD_LIBRARY_PATH` for the user):

if [[ "$(basename "$0")" == "script.sh" ]]; then
    echo "ERROR: Don't run $0, source it" >&2
    exit 1
fi

Replace `script.sh` with the name of the actual file.
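
A common bash-specific alternative (a sketch, assuming bash) avoids hardcoding the filename: `BASH_SOURCE[0]` names the file being read, while `$0` names whatever was invoked, and the two only differ when the file is being sourced. In a real guard you'd `return` (not `exit`) from the sourced branch so you never kill the caller's shell.

```shell
# Detect whether this file is being sourced or executed (bash only).
if [ "${BASH_SOURCE[0]}" = "$0" ]; then
    mode="executed"
else
    mode="sourced"
fi
echo "this file is being $mode"
```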

Then, to URL Encode a parameter in bash (I used this to encode a user provided git password into the url), use this function:

rawurlencode() {
  local string="${1}"
  local strlen=${#string}
  local encoded=""
  local pos c o

  for (( pos=0 ; pos<strlen ; pos++ )); do
     c=${string:$pos:1}
     case "$c" in
        [-_.~a-zA-Z0-9] ) o="${c}" ;;
        * )               printf -v o '%%%02x' "'$c" ;;
     esac
     encoded+="${o}"
  done
  echo "${encoded}"    # You can either set a return variable (FASTER)
  REPLY="${encoded}"   #+or echo the result (EASIER)... or both... :p
}

Then you can use it like so:

echo "Enter your GitHub username:"
read GIT_USER
echo "Enter your GitHub password:"
read -s GIT_PASSWORD

GIT_AUTH=$( rawurlencode "${GIT_USER}" ):$( rawurlencode "${GIT_PASSWORD}" )
git clone "http://${GIT_AUTH}@github.com/user/repo"

*yeah yeah, I know.. SSH keys are better… tell it to the users, not me*
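
A quick sanity check of the encoder (the function is repeated here so the snippet is self-contained; the password string is made up):

```shell
rawurlencode() {
  local string="${1}"
  local strlen=${#string}
  local encoded=""
  local pos c o

  for (( pos=0 ; pos<strlen ; pos++ )); do
     c=${string:$pos:1}
     case "$c" in
        [-_.~a-zA-Z0-9] ) o="${c}" ;;     # unreserved chars pass through
        * )               printf -v o '%%%02x' "'$c" ;;  # everything else -> %XX
     esac
     encoded+="${o}"
  done
  echo "${encoded}"
}

rawurlencode 'p@ss w0rd!'   # prints p%40ss%20w0rd%21
```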

Finally, using `getopts`. It’s a big tool, but here are the basics of using it:

while getopts "aj:s" o; do
    case "${o}" in
        a)
            # handle -a
            ;;
        j)
            # handle -j; its argument is in ${OPTARG}
            ;;
        s)
            # handle -s
            ;;
        *)
            usage
            ;;
    esac
done
shift $((OPTIND-1))

Important bits:

* The `getopts` line: include each character recognized as a flag, and follow it with a `:` if it requires an argument.
* Then include a case for each one in the big switch; use `${OPTARG}` to get the argument, if it’s needed.

That makes argument parsing in bash scripts simple… Also define a `usage` function to spew out the supported options, and you’re golden.
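
Putting the pieces together, here's a fleshed-out sketch of such a loop; the flags (`-a`, `-j N`, `-s`) and the variable names are hypothetical, not from any real tool:

```shell
# Parse -a (flag), -j N (takes an argument), -s (flag), then echo the result.
parse_args() {
    local OPTIND=1 o all=0 jobs=1 silent=0
    while getopts "aj:s" o; do
        case "${o}" in
            a) all=1 ;;
            j) jobs=${OPTARG} ;;   # ${OPTARG} holds the flag's argument
            s) silent=1 ;;
            *) return 1 ;;
        esac
    done
    shift $((OPTIND-1))            # drop the parsed flags
    echo "all=$all jobs=$jobs silent=$silent rest=$*"
}

parse_args -a -j 4 file1 file2   # prints: all=1 jobs=4 silent=0 rest=file1 file2
```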

Fishshell: CMake & finding source directories

CMake is a great tool for makefile generation, far better than the old arcane configure scripts and such, but its great out-of-source build support can lead to a common annoyance: constantly jumping back and forth between build and source directories, or juggling multiple build directories for a single source checkout. In my case, I frequently find myself forgetting exactly where the correct source tree is when I’m working in a build tree.

So, here’s a little fish function called `prompt_src` that you can add to your prompt to let it always show you the source directory for the build tree you’re in, and also show the current git version that you’re working from. The image above shows my OpenCV/build directory (the response is in yellow, indicating it’s not a Git source tree), and then my main application directory showing in red because it’s been modified, though it’s on the develop branch.

# src
function prompt_src --description "Find details on this build's SOURCE dir"
    set -l cdir (pwd)
    while [ $cdir != "/" ]
        if [ -e $cdir/CMakeCache.txt ]
            set -l SourceDir (cat $cdir/CMakeCache.txt | grep SOURCE_DIR | head -n 1 | cut -d '=' -f 2)
            if [ -d $SourceDir/.git ]
                set -l gitinfo (git --git-dir=$SourceDir/.git --work-tree=$SourceDir status -sb --untracked-files=no)
                set -l branch (echo $gitinfo[1] | cut -b 4-)
                set -l branch_color (set_color red)\[M\]
                if test (count $gitinfo) -eq 1
                    set branch_color (set_color green)
                end
                echo \* Builds (set_color green)$SourceDir $branch_color \($branch\) (set_color normal)
            else
                echo \* Builds (set_color yellow)$SourceDir (set_color normal)
            end
            return
        end
        set cdir (dirname $cdir)
    end
end

FishShell: Create & Expand compressed archives

If you spend much time in a terminal, be it on Mac or Linux, one thing you wind up doing often is creating and decompressing tarballs. Be they tgz, tbz, or just plain tar files, they’re the archive format of choice for folks working in *nix environments due to their universal support. Most commonly, the only way to get any real status information is to turn on "verbose" mode, which outputs each filename as it goes. That’s not terribly useful for large archives.

If you install the `pv` tool, you can partner it with some command-line-fu to get nice progress bars. But why deal with that every time, when you can write a fish function to do it for you?

Here's a script (expand.fish) that decompresses a variety of formats:

function expand -d 'Decompress a file' -a filename
    set -l extension (echo $filename | awk -F . '{print $NF}')
    switch $extension
        case tar
            echo "Un-tar-ing $filename..."
            pv $filename | tar xf -
        case tgz
            echo "Un-tar/gz-ing $filename..."
            pv $filename | tar zxf -
        case tbz
            echo "Un-tar/bz-ing $filename..."
            pv $filename | tar jxf -
        case gz
            echo "Un-gz-ing $filename..."
            pv $filename | gunzip -
        case bz
            echo "Un-bz-ing $filename..."
            pv $filename | bunzip2 -
        case zip
            echo "Un-zipping $filename..."
            unzip $filename
        case '*'
            echo I don\'t know what to do with $extension files.
    end
end

And here's a matching script for creating tarballs:

function tarball -d "Create a tarball of collected files" -a filename
    echo "Creating a tarball of $filename"
    if [ -e $filename ]
        echo "Sorry, $filename already exists."
        return 1
    end
    set -l args $argv[2..-1]
    set -l size (du -ck $args | tail -n 1 | cut -f 1)
    set -l extension (echo $filename | awk -F . '{print $NF}')

    switch $extension
        case tgz
            tar cf - $args | pv -p -s {$size}k | gzip -c > $filename
        case tbz
            tar cf - $args | pv -p -s {$size}k | bzip2 -c > $filename
        case '*'
            echo "I don't know how to make a '$extension' file."
            return 1
    end
    set -l shrunk (du -sk $filename | cut -f 1)
    set -l ratio (math "$shrunk * 100.0 / $size")
    echo Reduced {$size}k to {$shrunk}k \({$ratio}%\)
end


Tar/Untar on OSX/Linux with pretty progress bars

On a console, install `pv` (available from Homebrew and apt-get) and then use the following to get a nice progress bar with percentage progress and ETA when decompressing:

pv file.tgz | tar xzf - -C target_directory

And use this when compressing for the same:

SIZE=`du -sk folder-with-big-files | cut -f 1`
tar cvf - folder-with-big-files | pv -p -s ${SIZE}k | bzip2 -c > big-files.tar.bz2

Works with gzip instead of bzip too!

Handy Git Configuration

Spending lots of time in git lately, I thought I’d log my git environment here, since I keep having to replicate it on various machines.

git config --global rerere.enabled true

git config --global alias.lg "log --color --graph --pretty=format:'%Cred%h%Creset -%C(yellow)%d%Creset %s %Cgreen(%cr) %C(bold blue)<%an>%Creset' --abbrev-commit"
git config --global push.default current
git config --global user.name "Randall Hand"
git config --global user.email ""
git config --global color.ui true
git config --global core.editor vim
git config --global core.autocrlf input

And on a mac, I have a few more:

git config --global credential.helper osxkeychain
git config --global core.editor 'subl -w'
git config --global mergetool.sublime.cmd 'subl -w $MERGED'
git config --global mergetool.sublime.trustexitcode false
git config --global merge.tool sublime

Anyone else have any neat things?
Addition Dec-09:

git config --global branch.autosetuprebase always

And for any existing branches that you want to "convert" over to always rebase, execute this in bash:

for branch in $(git for-each-ref --format='%(refname)' -- refs/heads/); do git config branch."${branch#refs/heads/}".rebase true; done
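
The only subtle part of that one-liner is the `${branch#refs/heads/}` parameter expansion, which strips the ref namespace so `git config` gets the short branch name. A standalone illustration (the ref below is just a made-up example):

```shell
# ${var#prefix} removes the shortest match of "prefix" from the front of var.
branch="refs/heads/feature/retry-logic"
echo "${branch#refs/heads/}"   # prints feature/retry-logic
```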

More Xeno & Fish!

Here’s another little snippet for anyone interested… I wanted my Terminal prompt to show all active xeno sessions, as an easy reminder that I may have left one open, but the plain listing was a little dull. Being a Viz guy, what can I do to make it better? _Add Color!_ So below, find a little fish function that you can add to your fish_prompt.fish script (or just call manually) for nicely colored output as shown above.

# xeno_list
function xeno_list --description 'Colored xeno list results'
    for s in (xeno-list)
        set -l xen_id (echo $s | cut -d ':' -f 1)
        set -l xen_desc (echo $s | cut -d ':' -f 2-)

        set -l xen_array (echo $s | tr ' ' \n)
        set -l statuscolor (set_color green)
        set -l desccolor (set_color normal)
        if test $xen_array[-1] = "unsynced"
            set statuscolor (set_color red)
            set desccolor (set_color yellow)
        end
        echo $statuscolor\[$xen_id\] $desccolor $xen_desc (set_color normal)
    end
end

FishShell and Xeno.io

A month ago or so, a friend of mine turned me onto a new unix shell called [FishShell](http://fishshell.com). It shares some similarities with other shells, but offers lots of really nice features. It has a vastly improved autocompletion system (including an amazing tool that parses all your installed man pages and generates an autocompletion database). It took me a while to work out some of the syntax (no more `export PATH=A`, but rather `set -x PATH A`). I’ve switched all my machines over to it (Mac and Linux) and I’m loving it so far.

Then, the other day I found out about a great tool called [Xeno.io](http://xeno.io). It’s a tool that combines git and ssh into a single stream that lets you edit remote files with local editors. I hooked it up with Sublime (a quick `set -xU EDITOR subl` and `set -xU GIT_EDITOR 'subl -w'`), and it’s a great way to edit code on remote systems without having to use screen and such. And if your connection drops, no worries! It’s stored in git, and when it comes back it’ll resync.

So I spent some time merging the two, and built the following autocompletions for xeno that support the major operations and fill in open sessions. Hope someone out there finds it useful!

# xeno

function __fish_xeno_available_sessions
    xeno-list | cut -d ':' -f 1
end

function __fish_xeno_needs_command
    set cmd (commandline -opc)
    if [ (count $cmd) -eq 1 -a $cmd[1] = 'xeno' ]
        return 0
    end
    return 1
end

function __fish_xeno_using_command
    set cmd (commandline -opc)
    if [ (count $cmd) -gt 1 ]
        if [ $argv[1] = $cmd[2] ]
            return 0
        end
    end
    return 1
end

complete -f -c xeno -n '__fish_xeno_needs_command' -a 'list stop resume sync ssh edit'

complete -f -c xeno -n '__fish_xeno_needs_command' -a list --description 'List open sessions'
complete -f -c xeno -n '__fish_xeno_using_command list' -a '(__fish_xeno_available_sessions)'

complete -f -c xeno -n '__fish_xeno_needs_command' -a stop --description 'Shutdown a session'
complete -f -c xeno -n '__fish_xeno_using_command stop' -a '(__fish_xeno_available_sessions)'

complete -f -c xeno -n '__fish_xeno_needs_command' -a resume --description 'Resume a previously used session'
complete -f -c xeno -n '__fish_xeno_using_command resume' -a '(__fish_xeno_available_sessions)'

complete -f -c xeno -n '__fish_xeno_needs_command' -a sync --description 'Force a sync of data'
complete -f -c xeno -n '__fish_xeno_using_command sync' -a '(__fish_xeno_available_sessions)'

complete -f -c xeno -n '__fish_xeno_needs_command' -a ssh --description 'SSH to a host to prepare for editing'
complete -f -c xeno -n '__fish_xeno_needs_command' -a edit --description 'Edit a file'