Extraterrestrial Map Kinections


Fig 1 – LRO Color Shaded Relief map of moon – Silverlight 5 XNA with Kinect interface


Silverlight 5 was released after a short delay, at the end of last week.
Just prior to exiting stage left, Silverlight, along with all plugins, sings a last aria. The spotlight now shifts abruptly to a new diva, mobile HTML5. Backstage, the enterprise awaits with a bouquet of roses. Their affair will linger long into the late evening of 2021.

The Last Hurrah?

Kinect devices continue to generate a lot of hacking interest, and with the release of an official Microsoft Kinect beta SDK for Windows, things get even more interesting. Unfortunately, Kinect and the web aren’t exactly ideal partners. It’s not that web browsers wouldn’t benefit from moving beyond the venerable mouse/keyboard events. After all, look at the way mobile touch, voice, inertia, gyro, accelerometer, GPS, and the rest have suddenly become base features in mobile browsing. The reason Kinect isn’t part of the sensor event farmyard may be just a lack of portability and an ‘i’ prefix. Shrinking a Kinect doesn’t work too well, as stereoscopic imagery needs a degree of separation in a Newtonian world.

[The promised advent of NearMode (50cm range) offers some tantalizing visions of 3D voxel UIs. Future mobile devices could potentially take advantage of the human body’s bilateral symmetry: simply cut the device in two and mount one half on each shoulder. But that isn’t the state of hardware at present.]


Fig 2 – a not so subtle fashion statement OmniTouch


For the present, experimenting with Kinect control of a Silverlight web app requires a relatively static configuration and a three-step process: the Kinect out there beyond the stage lights, the web app over here close at hand, and a software piece in the middle. The Kinect SDK, which roughly corresponds to our visual and auditory cortex, amplifies and simplifies a flood of raw sensory input to extract bits of “actionable meaning.” The beta Kinect SDK gives us device drivers and APIs in managed code. However, since these APIs have not been compiled for the Silverlight runtime, a Silverlight client will by necessity be one step further removed.

Microsoft includes some rich sample code as part of the Kinect SDK download. In addition there are a couple of very helpful blog posts by David Catuhe and a codeplex project, kinect toolbox.

Step 1:

The approach for this experimental map interface is to use the GestureViewer code from Kinect Toolbox to capture some primitive commands arising from sensory input. The command repertoire is minimal: four compass-direction swipes, plus two circular gestures for zooming (circle clockwise to zoom in, circle counterclockwise to zoom out). Voice commands are pretty much a freebie, so I’ve added a few to the mix. Since the GestureViewer toolbox includes a learning, template-based gesture module, you can capture just about any gesture desired. I’m choosing to keep this simple.
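As a rough sketch of what this step looks like with the Kinect Toolbox detectors (identifiers follow the toolbox’s beta-era API from memory, and `SendToSocket`, `nui`, and the recorded circle-gesture stream are assumptions, not code from this project):

```csharp
// Wire up a swipe detector plus a template-trained circle detector.
SwipeGestureDetector swipeDetector = new SwipeGestureDetector();
TemplatedGestureDetector circleDetector =
    new TemplatedGestureDetector("Circle", recordedCircleStream);

// Each detector raises OnGestureDetected with a gesture name string
// ("SwipeToLeft", "Circle", ...), which we forward to the socket service.
swipeDetector.OnGestureDetected += gesture => SendToSocket(gesture);
circleDetector.OnGestureDetected += gesture => SendToSocket(gesture);

// Called for each skeleton frame: feed the right-hand joint position
// to the detectors (second Add argument approximates the toolbox signature).
void ProcessFrame(SkeletonData skeleton)
{
    Joint hand = skeleton.Joints[JointID.HandRight];
    swipeDetector.Add(hand.Position, nui.SkeletonEngine);
    circleDetector.Add(hand.Position, nui.SkeletonEngine);
}
```

The template detector is the interesting part: record a circle gesture a few times in GestureViewer, save the learning file, and it will match future hand paths against that template.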

Step 2:

Once gesture recognition for these six commands is available, step 2 is handing the commands off to a Silverlight client. In this project I used a socket service running on a separate thread. As gestures are detected, they are pushed out to local port 4530 on a TCP socket service. Other approaches may work better with the final release of Silverlight 5.
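A minimal sketch of that hand-off, assuming a single connected Silverlight client and newline-delimited command strings (the class and method names here are illustrative, not the project’s actual code):

```csharp
using System.Net;
using System.Net.Sockets;
using System.Text;
using System.Threading;

class GestureSocketService
{
    private TcpListener listener;
    private NetworkStream clientStream;

    public void Start()
    {
        // Listen on the local loopback port the Silverlight client will dial.
        listener = new TcpListener(IPAddress.Loopback, 4530);
        listener.Start();
        new Thread(() =>
        {
            TcpClient client = listener.AcceptTcpClient(); // block off the UI thread
            clientStream = client.GetStream();
        }) { IsBackground = true }.Start();
    }

    public void SendGesture(string gesture)
    {
        if (clientStream == null) return; // no client connected yet
        byte[] bytes = Encoding.UTF8.GetBytes(gesture + "\n"); // newline-delimited
        clientStream.Write(bytes, 0, bytes.Length);
    }
}
```

Note that Silverlight restricts client sockets to the 4502–4534 port range and requires a socket policy to be served, which presumably motivates the choice of 4530.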

Step 3:

The Silverlight client connects to port 4530 (Silverlight sockets are client-only) and reads the command strings as they show up. Once read, each command can then be translated into appropriate actions for our Map Controller.
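The Silverlight side can be sketched roughly like this, assuming the code lives inside the page class so `Dispatcher` and `SocketCommand` are in scope (a simplification of whatever the project actually does):

```csharp
// Connect out to the local gesture service and loop on async receives.
Socket socket = new Socket(AddressFamily.InterNetwork,
                           SocketType.Stream, ProtocolType.Tcp);
SocketAsyncEventArgs args = new SocketAsyncEventArgs
{
    RemoteEndPoint = new DnsEndPoint("localhost", 4530)
};
args.Completed += (s, e) =>
{
    if (e.LastOperation == SocketAsyncOperation.Connect)
    {
        byte[] buffer = new byte[1024];
        e.SetBuffer(buffer, 0, buffer.Length);
        socket.ReceiveAsync(e);                  // start the read loop
    }
    else if (e.LastOperation == SocketAsyncOperation.Receive
             && e.BytesTransferred > 0)
    {
        string text = Encoding.UTF8.GetString(e.Buffer, 0, e.BytesTransferred);
        foreach (string command in text.Split('\n'))
            if (command.Length > 0)              // marshal to the UI thread
                Dispatcher.BeginInvoke(() => SocketCommand(command.Trim()));
        socket.ReceiveAsync(e);                  // keep reading
    }
};
socket.ConnectAsync(args);
```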


Fig 3 – Kinect to Silverlight architecture

Full Moon Rising


But first, instead of the mundane, let’s look at something a bit extraterrestrial, a more fitting client for such “extraordinary” UI talents. NASA has been very busy collecting large amounts of fascinating data on our nearby planetary neighbors. One data set, recently released by ASU, stitches together a comprehensive lunar relief map with beautiful color shading. Wow, what if the moon really looked like this!


Fig 4 – ASU LRO Color Shaded Relief map of moon

In addition to the ASU moon, USGS has published a set of imagery for Mars, Venus, and Mercury, as well as some moons of Saturn and Jupiter. Finally, JPL thoughtfully shares a couple of WMS services and some imagery of the other planets.

This type of data wants to be 3D, so I’ve brushed off code from a previous post, NASA Neo 3D XNA, and adapted it for planetary data, minus the population bump map. However, bump maps for depicting terrain relief are still a must-have. A useful tool for generating bump or normal imagery from color relief is SSBump Generator v5.3. The result is an image that encodes the relative elevation of the moon’s surface. This is added to the XNA rendering pipeline, where the surface texture is combined with the color relief imagery and then applied to a simplified spherical model.
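In outline, feeding the two textures into the SL5 XNA drawing surface looks something like the sketch below (sampler assignments and stream names are assumptions; note that the SL5 XNA subset loads textures from streams rather than a ContentManager):

```csharp
// Load the color relief and the generated normal map.
Texture2D colorRelief = Texture2D.FromStream(device, colorStream);
Texture2D normalMap   = Texture2D.FromStream(device, normalStream);

// Bind them to the samplers the pixel shader expects.
device.Textures[0] = colorRelief;  // sampler s0: base color
device.Textures[1] = normalMap;    // sampler s1: tangent-space normals

// Conceptually, the pixel shader perturbs the sphere normal per pixel:
//   float3 n = normalize(tex2D(normalSampler, uv).rgb * 2 - 1);
//   float light = saturate(dot(n, lightDir));
//   return tex2D(colorSampler, uv) * light;
device.DrawIndexedPrimitives(PrimitiveType.TriangleList,
                             0, 0, vertexCount, 0, triangleCount);
```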


Fig 5 – part of normal map from ASU Moon Color Relief imagery

The result is seen in the MoonViewer client with the added benefit of immediate mode GPU rendering that allows smooth rotation and zoom.

The other planets and moons have somewhat less data available, but still benefit from the XNA treatment. Only Earth, Moon, Mars, Ganymede, and Io have data affording bump map relief.

I also added a quick 2D WMS viewer in HTML using OpenLayers against the JPL WMS servers to take a look at lunar landing sites. Default OpenLayers isn’t especially pretty, but it takes less than 20 lines of JS to get a zoomable viewer with landing locations. I would have preferred the elegance of Leaflet.js, but EPSG:4326 isn’t supported in L.TileLayer.WMS(). MapProxy promises a way to proxy the planet data as EPSG:3857 tiles for Leaflet consumption, but OpenLayers offers a simpler path.


Fig 6 – OpenLayer WMS viewer showing lunar landing sites

Now that the Viewer is in place it’s time to take a test drive. Here is a ClickOnce installer for GestureViewer modified to work with the Silverlight Socket service:

Recall that this is a Beta SDK, so in addition to a Kinect prerequisite, there are some additional runtime installs required:

Using the Kinect SDK Beta

Download Kinect SDK Beta 2:

Be sure to look at the system requirements and the installation instructions further down the page. This is still a beta and requires a few pieces. The release SDK is rumored to be available in the first part of 2012.

You may have to download some additional software as well as the Kinect SDK:

Finally, we are making use of port 4530 for the Socket Service. It is likely that you will need to open this port in your local firewall.

As you can see, this is not exactly a user-friendly installation, but the reward is seeing Kinect control of a mapping environment. If you are hesitant to go through all of this install trouble, here is a video link that will give you an idea of the results.

YouTube video demonstration of Kinect Gestures


Voice commands using the Kinect are very simple to add, so this version includes a few.
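The recognition side follows the beta SDK’s managed speech pattern, roughly as sketched here (the recognizer id and audio format come from the SDK samples; `SendToSocket` and the trimmed-down phrase list are assumptions):

```csharp
// Find the Kinect acoustic model and build a tiny command grammar.
RecognizerInfo ri = SpeechRecognitionEngine.InstalledRecognizers()
    .First(r => r.Id == "SR_MS_en-US_Kinect_10.0");
var engine = new SpeechRecognitionEngine(ri.Id);

var commands = new Choices("moon-on", "mars-on", "rotate-east",
                           "rotate-off", "reset");
engine.LoadGrammar(new Grammar(new GrammarBuilder(commands)));

engine.SpeechRecognized += (s, e) =>
{
    if (e.Result.Confidence > 0.9)       // ignore low-confidence hits
        SendToSocket(e.Result.Text);     // same pipeline as the gestures
};

// Point the engine at the Kinect's beam-formed microphone array.
var audio = new KinectAudioSource { SystemMode = SystemMode.OptibeamArrayOnly };
engine.SetInputToAudioStream(audio.Start(),
    new SpeechAudioFormatInfo(EncodingFormat.Pcm, 16000, 16, 1, 32000, 2, null));
engine.RecognizeAsync(RecognizeMode.Multiple);
```

Whatever string the recognizer produces simply rides the same socket as the swipe and circle gestures, which is why voice commands are nearly free once step 2 is in place.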

Here is the listing of available commands:

       public void SocketCommand(string command)
        {
            switch (command)
            {
                // Kinect voice commands
                case "mercury-on": { MercuryRB.IsChecked = true; break; }
                case "venus-on": { VenusRB.IsChecked = true; break; }
                case "earth-on": { EarthRB.IsChecked = true; break; }
                case "moon-on": { MoonRB.IsChecked = true; break; }
                case "mars-on": { MarsRB.IsChecked = true; break; }
                case "marsrelief-on": { MarsreliefRB.IsChecked = true; break; }
                case "jupiter-on": { JupiterRB.IsChecked = true; break; }
                case "saturn-on": { SaturnRB.IsChecked = true; break; }
                case "uranus-on": { UranusRB.IsChecked = true; break; }
                case "neptune-on": { NeptuneRB.IsChecked = true; break; }
                case "pluto-on": { PlutoRB.IsChecked = true; break; }

                case "callisto-on": { CallistoRB.IsChecked = true; break; }
                case "io-on": { IoRB.IsChecked = true; break; }
                case "europa-on": { EuropaRB.IsChecked = true; break; }
                case "ganymede-on": { GanymedeRB.IsChecked = true; break; }
                case "cassini-on": { CassiniRB.IsChecked = true; break; }
                case "dione-on": { DioneRB.IsChecked = true; break; }
                case "enceladus-on": { EnceladusRB.IsChecked = true; break; }
                case "iapetus-on": { IapetusRB.IsChecked = true; break; }
                case "tethys-on": { TethysRB.IsChecked = true; break; }

                case "moon-2d":
                    {
                        MoonRB.IsChecked = true;
                        Uri uri = Application.Current.Host.Source;
                        System.Windows.Browser.HtmlPage.Window.Navigate(new Uri(uri.Scheme + "://" + uri.DnsSafeHost + ":" + uri.Port + "/MoonViewer/Moon.html"), "_blank");
                        break;
                    }
                case "mars-2d":
                    {
                        MarsRB.IsChecked = true;
                        Uri uri = Application.Current.Host.Source;
                        System.Windows.Browser.HtmlPage.Window.Navigate(new Uri(uri.Scheme + "://" + uri.DnsSafeHost + ":" + uri.Port + "/MoonViewer/Mars.html"), "_blank");
                        break;
                    }
                case "nasaneo":
                    {
                        EarthRB.IsChecked = true;
                        System.Windows.Browser.HtmlPage.Window.Navigate(new Uri(""), "_blank");
                        break;
                    }
                case "rotate-east":
                    {
                        RotationSpeedSlider.Value += 1.0;
                        tbMessage.Text = "rotate east";
                        break;
                    }
                case "rotate-west":
                    {
                        RotationSpeedSlider.Value -= 1.0;
                        tbMessage.Text = "rotate west";
                        break;
                    }
                case "rotate-off":
                    {
                        RotationSpeedSlider.Value = 0.0;
                        tbMessage.Text = "rotate off";
                        break;
                    }
                case "reset":
                    {
                        RotationSpeedSlider.Value = 0.0;
                        orbitX = 0;
                        orbitY = 0;
                        tbMessage.Text = "reset view";
                        break;
                    }

                // Kinect swipe algorithmic commands
                case "swipetoleft":
                    {
                        orbitY += Microsoft.Xna.Framework.MathHelper.ToRadians(15);
                        tbMessage.Text = "orbit left";
                        break;
                    }
                case "swipetoright":
                    {
                        orbitY -= Microsoft.Xna.Framework.MathHelper.ToRadians(15);
                        tbMessage.Text = "orbit right";
                        break;
                    }
                case "swipeup":
                    {
                        orbitX += Microsoft.Xna.Framework.MathHelper.ToRadians(15);
                        tbMessage.Text = "orbit up";
                        break;
                    }
                case "swipedown":
                    {
                        orbitX -= Microsoft.Xna.Framework.MathHelper.ToRadians(15);
                        tbMessage.Text = "orbit down";
                        break;
                    }

                // Kinect gesture template commands
                case "circle":
                    {
                        if (scene.Camera.Position.Z > 0.75f)
                            scene.Camera.Position += zoomInVector * 5;
                        tbMessage.Text = "zoomin";
                        break;
                    }
                case "circle2":
                    {
                        scene.Camera.Position += zoomOutVector * 5;
                        tbMessage.Text = "zoomout";
                        break;
                    }
            }
        }

Possible Extensions

After posting this code, I added an experimental stretch vector control for zooming and two-axis twisting of planets. These are activated by voice: ‘vector twist’, ‘vector zoom’, and ‘vector off.’ The Map control side of gesture commands could also benefit from some easing-function animations. Another avenue of investigation would be some type of pointer intersection, using a ray to indicate planet surface locations for events.


Even though Kinect browser control is not prime-time material yet, it is a lot of experimental fun! The MoonViewer control experiment is relatively primitive. Cursor movement and click using posture detection and hand tracking are also feasible, but fine movement is still a challenge. Two-hand vector control for 3D scenes is also promising and integrates very well with SL5 XNA immediate-mode graphics.

Kinect 2.0 and NearMode will offer additional granularity. Instead of large swipe gestures, finger level manipulation should be possible. Think of 3D voxel space manipulation of subsurface geology, or thumb and forefinger vector3 twisting of LiDAR objects, and you get an idea where this could go.

The merger of TV and internet holds promise for both whole body and NearMode Kinect interfaces. Researchers are also adapting Kinect technology for mobile as illustrated by OmniTouch.

. . . and naturally, lip reading ought to boost the karaoke crowd (it could help lip-synching pop singers and politicians as well).


Fig 7 – Jupiter Moon Io
