Processing + NyARToolkit + multiple marker tracking

For various reasons, I need to do multiple marker tracking in Processing with NyARToolkit. However, with the default NyAR4psg layer between these two, multiple marker tracking is downright hard, and when you do get it working, it’s not quite what you’d expect. After a few days of Java hacking, during which I was very pleasantly surprised by Eclipse, I am now pleased to present to you my modifications to NyAR4psg that make multiple marker tracking easy! See here:

Standard hiro and kanji markers tracked simultaneously with augmented reality sphere and cube. In the background some artwork by my daughter!

I’ve called it NyARMultiBoard, and you can use it instead of the default NyARBoard if you want to track multiple markers.

Download a ZIP file containing everything (source code, jar files) from this directory. If you unpack this into your Processing sketchbook/libraries directory, it should work out of the box. It’s a drop-in replacement for NyAR4psg, so you don’t need to have that installed as well. There is an example to get you started in NyAR2/example/NyARMultiTest. Note: this uses the GSVideo capture stack as I explain here; you should easily be able to change it back to the Processing default (just change GSCapture to Capture).
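
To give a flavour of what a sketch using it might look like, here is a hedged sketch only, not the authoritative API: `NyARMultiBoard` and the per-marker objects come from this post, but the constructor arguments, the `detect()` call, the `markers` array and the transform methods are my assumptions for illustration; the bundled NyARMultiTest example shows the real usage.

```java
// Hedged sketch: constructor signature, detect() and the markers array
// are assumptions; see NyAR2/example/NyARMultiTest for the real usage.
import processing.core.*;
import codeanticode.gsvideo.*;

GSCapture cam;
NyARMultiBoard nya;  // tracks several markers at once

void setup() {
  size(640, 480, P3D);
  cam = new GSCapture(this, width, height);
  // hypothetical: camera parameters plus one pattern file per marker
  String[] patterns = {"patt.hiro", "patt.kanji"};
  double[] widths = {80, 80};  // marker sizes in mm (assumed)
  nya = new NyARMultiBoard(this, width, height, "camera_para.dat",
                           patterns, widths);
}

void draw() {
  if (cam.available()) {;
    image(cam, 0, 0);
    if (nya.detect(cam)) {               // assumed detection entry point
      for (int i = 0; i < nya.markers.length; i++) {
        if (nya.markers[i].detected) {   // assumed per-marker flag
          nya.markers[i].beginTransform();  // assumed, mirrors NyARBoard
          fill(0, 255, 0);
          box(40);
          nya.markers[i].endTransform();
        }
      }
    }
  }
}
```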

Please let me know in the comments if this works (or doesn’t) for you!

I made this screencast to demonstrate the multiple marker tracking, assisted by TNR:

I also made this really bad screencast (old webcam + night time lighting + transcoding):

If you’re really into the details

I’ve just added two new classes, NyARMultiBoard and NyARMultiBoardMarker, to the default NyAR4psg distribution. Very importantly, NyARToolkit itself needs to be patched with one extra method in NyARDetectMarker; see the NyARMultiBoard comments.

Update on 20110304

I’ve fixed the problematic frame bug in gsvideo that many of you have been running into. See this post.

Update on 20110305

I’ve updated NyAR2 so that it also works with the P3D renderer, which is often faster for blitting the webcam image onto the display. The updated zip file is named, and it can be downloaded from the usual directory. My changes are based on NyAR4psg 0.3.0 and NyARToolkit 2.5.2.

Processing + GSVideo + NyARToolkit on Linux x86_64

Every now and then, I blast out the cruft from my nerd gland’s exit duct by writing a terribly nerdy post. This is just such a post, so if you don’t speak Nerd, I’d highly recommend that you go have some fun elsewhere, at least until my next Weekly Head Voices, of course!

As mentioned around these parts, I’m currently playing with Processing, a beautiful programming stack for making interactive visual, err, thingies. To be more specific, I’d like to use Processing together with something like ARToolkit to do real-time 3D tracking of markers in live video, for augmented reality fun. To see what this could look like, see this YouTube video:

Today’s challenge is getting the whole stack, including Processing, the GSVideo video capture library for Processing and the NyARToolkit augmented reality library for Processing, going on Linux x86_64 (64bit). On Linux x86 (32bit) this is much more straightforward, but I wouldn’t write a blog post about straightforward, now would I?

Here is the recipe:

  1. Make sure you have the native 64bit Sun JDK installed for your system.  On this Ubuntu 9.10 machine it’s sun-java6-jdk 6-15-1, on Ubuntu 10.04 (also tested) it’s 6.20dlj-1ubuntu3.
  2. Also install the jogl libraries, on this machine called libjogl-java.
  3. Make sure you have the whole of gstreamer installed. On Ubuntu, that means all packages containing “gstreamer”.
  4. Get and unpack the processing for Linux tarball (I’ve tested this whole procedure with processing 1.0.9, 1.1 and 1.2.1) from the processing download site.
  5. In the unpacked processing directory, remove the whole java subdirectory. Now make a symlink pointing to your system java directory (the one containing bin, ext, jre, lib, etc.).  On my system, that was:
    cd processing
    rm -rf java
    ln -s /usr/lib/jvm/java-6-sun- java
  6. In processing/libraries/opengl/library, remove the 3 libjogl*.so files and symlink their replacements from /usr/lib/jni, for example:
    cd processing/libraries/opengl/library
    rm lib*.so
    ln -s /usr/lib/jni/
    ln -s /usr/lib/jni/
    ln -s /usr/lib/jni/
  7. Download and unpack gsvideo into processing/libraries.  You should be able to run the examples in processing/libraries/gsvideo/examples/Capture with the PDE (Processing Development Environment).
  8. Download and unpack the NyARToolkit for Processing library into processing/libraries.
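
Steps 5 and 6 can be captured in a small shell function; the following is a minimal sketch, assuming the Ubuntu paths used above (pass your own processing directory, system JDK directory and jni directory as arguments):

```shell
# Sketch of steps 5 and 6 above as one function. The example paths in
# the comments (/usr/lib/jvm/java-6-sun, /usr/lib/jni) are the Ubuntu
# locations assumed in this post; adjust them for your system.
convert_processing() {
    proc_dir=$1     # unpacked processing directory
    system_java=$2  # e.g. /usr/lib/jvm/java-6-sun
    jni_dir=$3      # e.g. /usr/lib/jni

    # step 5: replace the bundled 32bit JDK with the system 64bit one
    rm -rf "$proc_dir/java"
    ln -s "$system_java" "$proc_dir/java"

    # step 6: replace the bundled jogl natives with the system ones
    (
        cd "$proc_dir/libraries/opengl/library" || return 1
        rm -f libjogl*.so
        for so in "$jni_dir"/libjogl*.so; do
            ln -s "$so" .
        done
    )
}
```

Calling, say, `convert_processing processing /usr/lib/jvm/java-6-sun /usr/lib/jni` from the directory containing the unpacked tarball should leave `processing/java` and the jogl natives pointing at the system copies.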

You should now be able to run the NyARToolkit examples by replacing the import as follows:

// replace this call:
// import*;
// by this call:
import codeanticode.gsvideo.*;

and changing the Capture class (twice) to GSCapture and perhaps also the capture resolution, depending on your camera. The relevant conversions are:

// Capture cam;
// becomes:
GSCapture cam;

// later, in setup():
// cam=new Capture(this,width,height);
// becomes:
cam=new GSCapture(this,width,height);

The major trick in all of this is converting your Processing installation to use your system 64bit JDK instead of its own built-in 32bit JDK.

Let me know in the comments if this worked (or didn’t) for you!