Vive Cane

I’m still alive just … busy, not bloggy … maybe someday I’ll be more bloggy. Here’s something that kept me busy …

Teleporting everywhere feels wrong, so my suggestion is to use the Vive wand like a cane. At two minutes the video feels a bit long, but it shows the idea off.

More or less; when you grip a wand, your avatar plants a cane in the world that you can push yourself off from. While the grip is held, I constantly offset the “foot” of the avatar so that the wand’s in-game position (in VR) doesn’t change. With two wands you can crawl around in VR (which Crytek already worked out), which opens some interesting possibilities.
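The per-frame correction boils down to a few vector subtractions; here’s a minimal sketch in Python (the names and the flat tuple math are mine, not from the actual project):

```python
# Cane locomotion sketch: while the grip is held, offset the player rig every
# frame so the wand's world position stays pinned to the "anchor" captured
# when the grip was first squeezed. All names here are hypothetical.

def update_cane(rig_pos, wand_local, grip_held, anchor):
    """Return (new_rig_pos, anchor) for one frame of cane locomotion."""
    wand_world = tuple(r + l for r, l in zip(rig_pos, wand_local))
    if not grip_held:
        # no cane planted: rig stays put, current wand pose becomes the anchor
        return rig_pos, wand_world
    # shift the rig backwards by however far the wand drifted off the anchor
    delta = tuple(w - a for w, a in zip(wand_world, anchor))
    new_rig = tuple(r - d for r, d in zip(rig_pos, delta))
    return new_rig, anchor
```

Pulling the wand towards you while gripping drags the whole rig forward, which is exactly the cane feel.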

At this point … I don’t really have much to say about this, it is what it is - an amusing way to avoid joystick motion sickness. There’s a handful of honey-dos I’d like to chase down with it but … 9-5, home-made pizzas, social life, StarCraft’s Co-Op, and my own build tool all compete for my attention.

If someone is reading this in the future - reach me on Twitter if you have a question or want a follow up.

Adding NotePad++ Macros to Atom.io

I haven’t posted anything in a while … so here’s how to get Atom.io macros that work kind of like NotePad++’s


  1. Install atom-keyboard-macros into Atom.

    The default keybindings did nothing for me … sorry

  2. Open your keybindings.cson

    • Hit CTRL+, > click on Keybindings > click on the blue text that names your keymap file
  3. Paste this wodge into the bottom of your keybindings.cson - PRESERVE THE INDENTATION!

    # almost NotePad++ macros for Atom.io!
    # based on https://github.com/JunSuzukiJapan/atom-keyboard-macros
    'atom-text-editor':
      'ctrl-shift-r': 'atom-keyboard-macros:start_kbd_macro'
      'ctrl-alt-r': 'atom-keyboard-macros:end_kbd_macro'
      'ctrl-shift-p': 'atom-keyboard-macros:call_last_kbd_macro'
  4. there is no step 4

Mirroring Git/GitHub to Hg/BitBucket

This seemed a lot longer when I planned it in my notebook at lunch.

| GitHub user | project (both sides) | BitBucket user | SCM | Schedule |
| --- | --- | --- | --- | --- |
| ocornut | imgui | g-pechorin | None | Periodic |
  1. Install hg-git
    • You’ll have to do this on the Jenkins server
    • You’ll have to do it either for the Jenkins user or for all users
    • I’m using an OS X machine as my host, so I was able to use easy_install to install hg-git and dulwich
  2. setup a project on GitHub
  3. create a Jenkins Freestyle project which runs periodically
    • Polling the SCM was NOT an option since there’s no default branch on GitHub
      • … this is a quirk of hg-git … I think
      • … IIRC/YRMV - so sling me a tweet or whatever if I’m wrong
  4. program the job to pull from Git, push to Hg, and ignore an exit code of 1
    • this was only elaborate because I needed it to not-fail when there were no changes
      #!/bin/bash
      # re-use the working copy if it's already there
      if [ -d "imgui" ]; then
        echo "Re-Using ImGUI"
        cd imgui
        hg pull git+ssh://git@github.com/ocornut/imgui.git
      else
        echo "Cloning ImGUI"
        hg clone git+ssh://git@github.com/ocornut/imgui.git
        cd imgui
      fi
      hg push -f ssh://hg@bitbucket.org/g-pechorin/imgui
      retcode=$?
      # hg push exits 1 when there's nothing new to push; treat that as success
      if [ $retcode -eq 0 ] || [ $retcode -eq 1 ]; then
        exit 0
      else
        exit $retcode
      fi
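For step 1, installing the package isn’t quite enough - the extension also needs switching on in the Jenkins user’s Mercurial config. A sketch, assuming the stock ~/.hgrc location:

```shell
# append an [extensions] stanza enabling hg-git for this user
cat >> "$HOME/.hgrc" <<'EOF'
[extensions]
hggit =
EOF
```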

Mercurial / Hg SubRepos

I’m still trying to catch up on stuff following Develop. I’ve decided to write a post about my experience(s) switching my work over to SubRepos.

I am unaware of the “reason” why they’re considered “bad.” Perhaps it’s a Unix thing? Maybe they don’t work as well as people feel that they should?

Whatever

I have a (secret) project called “nite-nite/” in which I use and develop some public-domain headers. I want this public-domain stuff to be … well … public-domain and visible to all. Putting it into a sub-repository seemed appropriate, so I started by setting up a separate repository on BitBucket. Following the basic usage I cloned this into my existing working copy and set it up as directed;

C:\Users\peter\Desktop\nite-nite>hg clone ssh://hg@bitbucket.org/g-pechorin/pal-public-domain
C:\Users\peter\Desktop\nite-nite>echo pal-public-domain = pal-public-domain > .hgsub
C:\Users\peter\Desktop\nite-nite>hg add .hgsub

So far so good, right? Well … not so much. The push command won’t work right with the setup I just used.

The fix is simple; the .hgsub file looks like this …

pal-public-domain = pal-public-domain

… and it needs to look like this …

pal-public-domain = ssh://hg@bitbucket.org/g-pechorin/pal-public-domain

So commit (or amend the previous commit) and push

C:\Users\peter\Desktop\nite-nite>hg commit -m "added public domain stuff"
C:\Users\peter\Desktop\nite-nite>hg push

I’m reasonably happy with this. As a bonus, I applied it to my blog, so the embedded Unity project can be carried as source rather than as a binary. Great, now I’ll get on with the actual work of moving those headers into the public-domain project.

vBlog-001

I fixed a bug. I don’t have a video of the bug itself - just of the fix


So - two more bugs to go before I can add features

Stop Motion

Mother told me to try something different

Stop Motion @ YouTube

I made a stop-motion video (mostly to see if I could) … also, I wanted to see if I could record “blocked out” storyboards, since I’m crap at drawing poses.

I spent £2 on some pipe cleaners and a dopey phone stand. I took the pictures on my phone. I used some Blu-Tack to help with posing, pyOpenCV3 to encode the JPEGs at 3 FPS, and ffmpeg to reduce the file size so that I could upload it in < 40 minutes.

The writeup took about 40 minutes … so maybe I didn’t save much time

I think the fella could use some firmer limbs (maybe pasta tubes?) to make animation easier. I went looking for used GI Joe toys to produce storyboards from; I wanted the articulated hips and joints to show things like dudes slouching. With firmer limbs I’d probably get smoother results … maybe …

It also might be good to have a steadier hand when taking the pictures.

Dual Hetero Quadro on Heaven

Literally the punchiest title I could come up with. I’ve been told that heterogeneous GPU setups are ridiculously slower than a single GPU. This is largely an anecdotal shrug of “hey - a second GPU doesn’t really slow my computer down in any meaningful manner at all!” I’m sure I did this all wrong and that the GPUs could be tweaked into a setting where this all becomes conclusive - but the legwork for that isn’t interesting to me, so I haven’t done it.

TL;DR: A second GPU which doesn’t match up or SLI won’t give you cooties.

Background

My workstation came with an NVidia PNY Quadro K600 GPU, which was replaced with a K620. I decided to put the K600 back into my computer as a secondary card and run the UniGine Heaven benchmark to see just how much a third wheel slows things down. All tests were carried out at full-screen-exclusive resolution but otherwise used the Heaven benchmark’s built-in presets of the same names. Stereoscopy was disabled for the non-stereo runs, and I used 3D Vision as UniGine’s stereo method. For the stereo runs the 3D OpenGL Stereo profile was used with Stereo = Enabled and Display Mode = Generic Active Stereo.

Statistics

I really can’t see any reason that a consumer would use this setup - it just amused me.

| Configuration | Detail Setting | Stereo | Score | Frames Per Second | Minimum FPS | Maximum FPS |
| --- | --- | --- | --- | --- | --- | --- |
| K620 | Extreme | Yes | 131 | 5.2 | 3.5 | 24.4 |
| K620 | Basic | Yes | 269 | 10.7 | 5.2 | 21.6 |
| K620 | Extreme | No | 281 | 11.2 | 7.3 | 22.5 |
| K620 | Basic | No | 592 | 23.5 | 15.0 | 42.2 |
| K620 + K600 | Extreme | Yes | 131 | 5.2 | 3.7 | 10.0 |
| K620 + K600 | Basic | Yes | 270 | 10.7 | 8.2 | 18.4 |
| K620 + K600 | Extreme | No | 280 | 11.1 | 7.2 | 22.0 |
| K620 + K600 | Basic | No | 591 | 23.5 | 14.8 | 40.5 |

Analysis

The relation between the numbers suggests that;

  • the second GPU does kind of slow it down measurably
  • the second GPU doesn’t slow it down noticeably
  • the Dual-GPU is a teensy bit helpful with Stereo rendering, but not quite worth the cost
  • in a real game, I could tell the second GPU to do PhysX work and maybe see some improvement?

I may have left some junk running on the desktop during the single-GPU tests, so the difference could be skewed a bit. Regardless; I’m confident that adding the second GPU to this desktop hasn’t sucked-up all my PCIe lanes or something silly.

Questions;

  • how would this work in an AMD/NVidia mix? (Red/Green)
  • would an x86 OS do any good/evil with an x86 benchmark?
  • would something with PhysX really be that different, good or evil?

Canadian Racing Geese

I have had several geese charge me - this is a poor approximation.

I was quite young, and the geese were acclimated to humans; more importantly, they knew how tasty the french fries and clam strips we carried were. I never had a chance, nor will I ever forget. Honking with a bestial hunger, the savage geese charged! As they pursued me across the fried clam shop’s parking lot, my mother could only cackle as she scrambled for the camera.

To this day - I have not returned to that shop.

Bin Plugins / Python 3.5.1

I was playing with Python’s binary extension system and was impressed with the simplicity. I think that the usage of setup.py encourages a consistent ecosystem … as opposed to the more open conventions used by Java and CLR.

(I followed the generic instructions and they worked fine on Windows 8.1 - disregard the hype/hate!)
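For flavour, the whole dance is about this small; here’s a hypothetical setup.py for a one-file C extension (the hello module and hello.c are made-up names, not the ones from my GitHub example):

```python
# setup.py - build script for a hypothetical one-file C extension.
# The matching hello.c would define PyInit_hello, the Python 3
# module-initialisation entry point.
from distutils.core import setup, Extension

setup(
    name="hello",
    version="0.1",
    ext_modules=[Extension("hello", sources=["hello.c"])],
)
```

Then `python setup.py build` compiles it, and `python setup.py install` drops it into site-packages.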

Example on GitHub

Samon 2

I wrote a whole post/blurb in one sitting!

Read it here

Also; Haoyi made a neat post about Build Tools (I mentioned his work at least once and wanted to be sure I had his deets and URL correct)