On a console, install `pv` (available from Homebrew and apt-get), then use the following to get a nice progress bar with percentage and ETA when decompressing:

pv file.tgz | tar xzf - -C target_directory

And use this when compressing for the same:

SIZE=`du -sk folder-with-big-files | cut -f 1`
tar cvf - folder-with-big-files | pv -p -s ${SIZE}k | bzip2 -c > big-files.tar.bz2

Works with gzip instead of bzip too!
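
For example, the gzip flavor of the compression pipeline is just a matter of swapping the compressor (same `SIZE` variable as above):

    SIZE=`du -sk folder-with-big-files | cut -f 1`
    tar cvf - folder-with-big-files | pv -p -s ${SIZE}k | gzip -c > big-files.tar.gz
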
Spending lots of time in git lately, I thought I'd log my git environment here, since I keep having to replicate it on various machines.

git config --global rerere.enabled true

git config --global alias.lg "log --color --graph --pretty=format:'%Cred%h%Creset -%C(yellow)%d%Creset %s %Cgreen(%cr) %C(bold blue)<%an>%Creset' --abbrev-commit"
git config --global push.default current
git config --global user.name "Randall Hand"
git config --global user.email ""
git config --global color.ui true
git config --global core.editor vim
git config --global core.autocrlf input
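
For reference, these commands just write entries into ~/.gitconfig, so the end result should look roughly like this (the long `lg` alias is omitted here for brevity):

    [rerere]
        enabled = true
    [push]
        default = current
    [user]
        name = Randall Hand
    [color]
        ui = true
    [core]
        editor = vim
        autocrlf = input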

And on a mac, I have a few more:

git config --global credential.helper osxkeychain
git config --global core.editor 'subl -w'
git config --global mergetool.sublime.cmd 'subl -w $MERGED'
git config --global mergetool.sublime.trustexitcode false
git config --global merge.tool sublime
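
With the merge tool wired up that way, resolving conflicts should be as simple as running it from a conflicted checkout:

    # steps through each conflicted file, opening it in Sublime via mergetool.sublime.cmd
    git mergetool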


Anyone else have any neat things?
Addition Dec-09:
——————

git config --global branch.autosetuprebase always

And for any existing branches that you want to "convert" over to always rebase, execute this in bash:

for branch in $(git for-each-ref --format='%(refname)' -- refs/heads/); do git config branch."${branch#refs/heads/}".rebase true; done
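
To sanity-check the result, something like this should list the per-branch rebase settings the loop just wrote:

    git config --get-regexp '^branch\..*\.rebase'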



Here's another little snippet for anyone interested… I wanted my Terminal prompt to show all active xeno sessions, as a reminder that I may have left one open, but the plain listing was a little dull. Being a Viz guy, what could I do to make it better? _Add Color!_ So below, find a little fish function that you can add to your fish_prompt.fish script (or just call manually) for nicely colored output as shown above.

# xeno_list
function xeno_list --description 'Colored xeno list results'
    for s in (xeno-list)
        set -l xen_id (echo $s | cut -d ':' -f 1)
        set -l xen_desc (echo $s | cut -d ':' -f 2-)

        set -l xen_array (echo $s | tr ' ' \n)
        set -l statuscolor (set_color green)
        set -l desccolor (set_color normal)
        if test $xen_array[-1] = "unsynced"
            set statuscolor (set_color red)
            set desccolor (set_color yellow)
        end

        echo $statuscolor\[$xen_id\] $desccolor $xen_desc (set_color normal)
    end
end
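
If it helps, here's a minimal sketch of how I'd call it from fish_prompt.fish (the prompt layout itself is just an example, not necessarily what you want):

    function fish_prompt
        # list any open xeno sessions, colored by sync status
        xeno_list
        echo -n (prompt_pwd) '> '
    end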

A month ago or so, a friend of mine turned me onto a new unix shell called [FishShell](http://fishshell.com). It shares some similarities with other shells, but offers lots of really nice features. It has a vastly improved autocompletion system (including an amazing tool that parses all your installed man pages and generates an autocompletion database). It took me a while to work out some of the syntax (no more `export PATH=A`, but rather `set -x PATH A`). I've switched all my machines over to it (Mac and Linux) and I'm loving it so far.
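
For anyone else making the switch, here are a couple of bash-to-fish translations that tripped me up at first (purely illustrative):

    # bash: export EDITOR=vim
    set -x EDITOR vim

    # bash: export PATH=$PATH:$HOME/bin  (PATH is a real list in fish)
    set -x PATH $PATH $HOME/bin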

Then, the other day I found out about a great tool called [Xeno.io](http://xeno.io). It's a tool that combines git and ssh into a single stream that lets you edit remote files with local editors. I hooked it up with Sublime (a quick `set -xU EDITOR subl` and `set -xU GIT_EDITOR 'subl -w'`), and it's a great way to edit code on remote systems without having to use screen and such. And if your connection drops, no worries! It's stored in git, and when it comes back it'll resync.

So I spent some time merging the two, and built the following autocompletions for xeno that support the major operations and fill in open session names. Hope someone out there finds it useful!

# Fish tab-completions for xeno


function __fish_xeno_available_sessions
    xeno-list | cut -d ':' -f 1
end

function __fish_xeno_needs_command
    set cmd (commandline -opc)
    if [ (count $cmd) -eq 1 -a $cmd[1] = 'xeno' ]
        return 0
    end
    return 1
end

function __fish_xeno_using_command
    set cmd (commandline -opc)
    if [ (count $cmd) -gt 1 ]
        if [ $argv[1] = $cmd[2] ]
            return 0
        end
    end
    return 1
end


complete -f -c xeno -n '__fish_xeno_needs_command' -a 'list stop resume sync ssh edit'


complete -f -c xeno -n '__fish_xeno_needs_command' -a list --description 'List open sessions'
complete -f -c xeno -n '__fish_xeno_using_command list' -a '(__fish_xeno_available_sessions)'


complete -f -c xeno -n '__fish_xeno_needs_command' -a stop --description 'Shutdown a session'
complete -f -c xeno -n '__fish_xeno_using_command stop' -a '(__fish_xeno_available_sessions)'


complete -f -c xeno -n '__fish_xeno_needs_command' -a resume --description 'Resume a previously used session'
complete -f -c xeno -n '__fish_xeno_using_command resume' -a '(__fish_xeno_available_sessions)'


complete -f -c xeno -n '__fish_xeno_needs_command' -a sync --description 'Force a sync of data'
complete -f -c xeno -n '__fish_xeno_using_command sync' -a '(__fish_xeno_available_sessions)'


complete -f -c xeno -n '__fish_xeno_needs_command' -a ssh --description 'SSH to a host to prepare for editing'
complete -f -c xeno -n '__fish_xeno_needs_command' -a edit --description 'Edit a file'
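
To have fish pick these up automatically, the file just needs to live somewhere on the completions path; assuming you saved everything above as xeno.fish, copying it into the per-user completions directory should be enough:

    cp xeno.fish ~/.config/fish/completions/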


For the last year I’ve basically given up on the OSX Calendar App and been using the Google Calendar website in a Pinned Tab in Chrome. Although I still keep OSX Calendar synced and up to date, it became a real problem that it didn’t support Google Hangouts invitations. As such, I found myself having to manually open Google Calendar before each meeting, hunting down the meeting and then joining the hangout. I basically had given up on ever finding a solution, but today Google finally won out.

On StackOverflow I found someone else with this problem, and another user had posted a clever combination of Automator, shell script, and AppleScript that makes it possible. It's not perfect, but now I can drag events from OSX Calendar to a little icon in my dock that reprocesses them and adds the hangout URL back in under the URL field, making it a simple click-and-launch to get into hangouts.

For simplicity, the procedure (with my one modification) is:

  1. In Automator, create a new document of type 'Application'
  2. Add a 'Get Specified Finder Items' step
  3. Add a 'Run Shell Script' step, and change 'Pass input' to 'as arguments' instead of 'to stdin'.
  4. Copy the following into the text box (a sample of the .ics lines it parses is shown just after this list):

      read url <<< $(cat "$1" | sed "s/$(printf '\r')\$//" | awk -F':' '/X-GOOGLE-HANGOUT/ {first = $2":"$3; getline rest; print (first)(substr(rest,2)); exit 1}';)
      read uid <<< $(cat "$1" | sed "s/$(printf '\r')\$//" | awk -F':' '/UID/ {print $2; exit 1}';)
      echo "$url"
      echo "$uid"
    
  5. Add a 'Run AppleScript' step

  6. Copy the following into the box replacing "myCalendar" with the name of your calendar:

      on run {input, parameters}
          set myURL to input's item 1
          set myUID to input's item 2
          set myCal to "myCalendar"
          tell application "Calendar"
              tell calendar myCal
                  set theEvent to first event whose uid = myUID
                  set (url of theEvent) to myURL
              end tell
          end tell
          return input
      end run
    
  7. Save the application and add it to your Dock
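
For the curious, the shell script in step 4 is just fishing two properties out of the .ics file that Calendar hands it. A hypothetical event might contain lines like these (values are made up; note how the hangout URL gets folded onto a continuation line that begins with a space, which is why the awk glues two lines back together):

    UID:1234567890abcdef@google.com
    X-GOOGLE-HANGOUT:https://plus.google.com/hangouts/_/example-hangout-id-
     continued-on-a-folded-line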


I've been following the tool "ThinkUp" off and on for a while now. Gina Trapani, who founded Lifehacker long ago, created it as a hobby project, and it was described with lots of fuzzy phrases like "social media insights engine". I was never very sure what it was, but every now and then I would hear neat things about it.

I noticed last week that they've apparently decided to create a commercial entity around it and launch a hosted website where you can just sign up and use it, instead of the current model where you download it and install it on your own LAMP stack after some configuration.

So I figured if they saw that much worth in it, maybe I should experiment with it. I downloaded it (version 2.0-beta8, available on GitHub) and set it up on a system of mine. After a failed start or two I got it running and was rather unimpressed. I kept tinkering with it and just couldn't see why I should care. But I left it running...

In my head I kept thinking about it and figured that since it's already pulling in all my FourSquare, Twitter, and Facebook history, it could create a nice Timeline view. Similar to TimeHop, but rather than showing me today in the past, just showing me today. Then I could take the results in a nice interleaved temporal format and toss them into Evernote for archiving. ThinkUp has no internal capability for this, but it has a decent plugin architecture and a rather simple MySQL schema that should make building this easy.

Then, I got to thinking about what else I could add. I use FitBit, so I wondered if I could get my FitBit steps & sleep stats pulled in and merged into the timeline as well. And hey, how about MyFitnessPal to show what/when I ate. And then RunKeeper blended in as well to see distance/speed. Oh, and I could access EverNote to pull in when, what, and where notes were created. And if I could get access to my PlaceMe stream I could merge that in too.

In short, I was fantasizing about building a single tool that could produce a daily "Day in my Life", showing me everything I did and everywhere I went on a minute-to-minute basis. It's kinda scary to think that I leave that much "digital dust" around the internet, but it's reality. I've already begun on the FitBit integration and have a basically functioning crawler and graph generator that shows my daily steps down to 5-minute increments. It needs a bit of integration work to clean it up, and then a Smarty template and such to get it viewable inside ThinkUp in a reasonable way, but I intend to release it on my GitHub fork once it's ready.

Oh, and while I was tinkering with all this and doing other stuff, ThinkUp kept churning away with its hourly updates. And when I logged in the other day I finally started to catch a glimpse of what it really does. It took a few days to gather enough data, but now it shows me some interesting statistics like:
  • Most of my Tweets are questions
  • Most of my status updates are personal (containing "i", "me", "my", etc)
  • Some neat graphs about topic vs reply/like count
  • Interesting graphs about which of my friends most commonly respond
So I'm going to let it keep going. I figure it can't hurt to get some information like this while I integrate the rest of the functions I want.

Also, I might need to find a better place to host it for reliability. Right now it's just running on a system inside my own network, so to reach it I have to establish an SSH tunnel and such. It works, and it's safe so that's a plus, but it's then subject to the (somewhat flaky) reliability of my own network and hardware. I saw a post saying you can theoretically now run ThinkUp on Google App Engine's new PHP support, but I can't find any information on how it actually performs or whether anyone's ever tried it. GAE has a 60-second limit on scripts which, in my experience, would cripple crawling, so I'm curious how that works out in practice.

However, I am a bit concerned about the future of ThinkUp. The latest version available is 2.0-beta8, with no commits on the main branch in two months. I see lots of activity on forks, including a few important bugfixes that are critical to getting it running with recent changes in Facebook, but none of them have been merged back to master. Surely, if they're preparing to roll out a company, they've fixed them internally. I fear that they may internally fork the project into an "Awesome for-pay version" and a "Less functional but open-source version that's always a bit out of date".

I guess only time will tell.......

Some of you may see that pic above and think it's from the weekend's Tropical Storm Karen, but you'd be wrong. That's actually just from a heavy rain about 2 weeks ago (which caused lots of problems I'm still dealing with).

We were all prepared for it to happen again this weekend as Tropical Storm Karen came through, but it (thankfully) wound up being a letdown. It never made landfall, and we only got about an hour of rain from it in total.

All in all, 2013 has been a beautifully quiet Hurricane Season. Of course, that just means next year has the potential to be twice as bad.

So, unless you live out of state or under a rock, you've undoubtedly heard that the Mississippi State Fair is in town. I had to work late on Friday night, and with Tropical Storm Karen bearing down, Laura decided to take the kids that evening by herself in case the rain caused problems over the weekend.

While I didn't get to go, she kept me in the loop with lots of SMS pics of the night. My favorite has to be the one above, where my son got on a ride a bit bigger than anything he'd experienced before: the look of sheer terror, combined with complete determination to do it no matter what.

Saw this today… seems about right.



With all the talk of "government shutdown", I've been talking to my government friends from my former life. Much to my surprise, they've all been labeled "non-essential". However, they all have leftover FY13 funds so they can continue to work and collect paychecks.

Which raises the question: if the government is funding people and departments with enough surplus that they can cover their own paychecks into the next year, maybe we should look there to fix some of our budget problems?

I logged in to connect this new Yeraze.com website to my OpenID (hosted through MyOpenID). Unfortunately, I found out that MyOpenID is closing on February 4th, 2014. Luckily, OpenID is built with this in mind, so you can use "OpenID Delegation". This means I actually log in with the URL www.yeraze.com, but a few lines of HTML at the top of the page determine where to reroute that request. So, in the event of my chosen OpenID provider shutting down, I can switch to another.

Thanks to this StackOverflow post I was able to set it up to redirect to Google. Now I can keep logging in with my Yeraze.com URL, and it just redirects me to Google authentication.
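
For reference, the delegation itself is just a couple of link tags in the page's head. Mine ended up looking something like the snippet below; the exact Google endpoint and the profile URL came from that StackOverflow answer and my own account, so treat them as placeholders rather than gospel:

    <link rel="openid2.provider" href="https://www.google.com/accounts/o8/ud?source=profiles" />
    <link rel="openid2.local_id" href="https://profiles.google.com/YOUR-GOOGLE-USERNAME" />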

Now, you may be asking why I don't use regular Google authentication. Going through OpenID, all I expose is my email address, whereas full Google authentication exposes all kinds of contact, email, calendar, and other data I'd rather not disclose.