Fulldome Development for Interactive Immersive Training : Entry 1

At the Digital Dome @ IAIA, we have begun our interactive fulldome research.

Software developer Charles Veasey flying through a colorful test scene.

Some of our goals include:

  • Running the dome from a single computer (versus the eight networked computers currently in use)
  • Playing dome master videos as .mov files in real time
  • Running video game environments for the dome, including Unity3D
  • Displaying OpenGL scenes within the dome
  • Setting up a fulldome VJ environment using modul8 and Vidvox
  • Calibrating input devices such as Wii controllers, Microsoft Kinect, multi-touch screens, and trackpads
  • Rendering data from the motion capture system on the dome
  • Setting up the dome for telepresence performances

We started by installing the DomeGL software developed by the University of New Mexico’s ARTS Lab. DomeGL is a real-time projection and calibration system for fulldome environments, written in C++ and OpenGL. It warps an OpenGL scene so that it maps seamlessly across the dome’s six physical projectors. The results were quite good after only a single calibration. The software is still in development, so the calibration process requires editing the source code; we’ll be streamlining this process so that non-programmers can easily use the calibration utilities.
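Dome masters follow a standard angular-fisheye layout, and calibration comes down to knowing where each pixel lands on the hemisphere. As a rough illustration of that mapping (our own sketch, not DomeGL’s actual code), here is a minimal Python function that converts a dome master pixel to azimuth and elevation, assuming the common equidistant fisheye convention with the zenith at the image center:

```python
import math

def dome_master_to_angles(x, y, size):
    """Map a pixel in a square dome master image to (azimuth, elevation).

    Assumes the standard equidistant fisheye convention: the image circle
    fills the square, the radius from center is proportional to the zenith
    angle, the zenith (elevation 90°) sits at the center, and the horizon
    lies on the rim.
    """
    cx = cy = size / 2.0
    dx, dy = x - cx, y - cy
    r = math.hypot(dx, dy) / (size / 2.0)   # 0 at zenith, 1 at horizon
    if r > 1.0:
        return None                          # outside the image circle
    zenith = r * (math.pi / 2.0)             # equidistant: angle ∝ radius
    elevation = math.pi / 2.0 - zenith
    azimuth = math.atan2(dy, dx)
    return azimuth, elevation

# The center pixel of a 4k dome master looks straight up (elevation 90°):
print(dome_master_to_angles(2048, 2048, 4096))  # -> (0.0, 1.5707963267948966)
```

A calibration system like DomeGL then has to solve the inverse problem per projector: which dome direction each projector pixel actually hits, which is what the manual warp editing adjusts.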

The UNM ARTS Lab has prepared about a dozen demo apps to showcase their DomeGL software. Several of these demos use the Wii controller and Wii Fit Balance Board. Domesteroids (see images and video) uses the Wii Fit Balance Board to navigate your spaceship through a 3D environment and the Wii controller to shoot asteroids as they fly toward you.

[youtube http://www.youtube.com/watch?v=X8UuAjIfse0]

The next projection system we tested was MadMapper by GarageCube. MadMapper lets you map OpenGL textures onto quad primitives and warp those primitives to create projections over non-uniform surfaces. We used a dome master image to see whether we could create a seamless projection over the surface of the dome. The results were pretty good. Next, we will attempt to place a dome grid on the dome using MadMapper.
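The quad warping described above is essentially bilinear interpolation: the user drags a quad’s corners to new screen positions, and every interior texture coordinate follows. A minimal sketch of that idea (our own illustration, not MadMapper internals):

```python
def warp_quad(u, v, corners):
    """Bilinearly map texture coords (u, v) in [0,1]² to a warped quad.

    corners = [top_left, top_right, bottom_right, bottom_left] as (x, y),
    the positions the user has dragged the quad's corners to.
    """
    (x0, y0), (x1, y1), (x2, y2), (x3, y3) = corners
    top    = (x0 + (x1 - x0) * u, y0 + (y1 - y0) * u)   # along the top edge
    bottom = (x3 + (x2 - x3) * u, y3 + (y2 - y3) * u)   # along the bottom edge
    return (top[0] + (bottom[0] - top[0]) * v,
            top[1] + (bottom[1] - top[1]) * v)

# An un-warped unit quad leaves coordinates unchanged:
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(warp_quad(0.5, 0.5, square))  # -> (0.5, 0.5)
```

On a curved surface like a dome, a single warped quad per projector is only an approximation, which is why finer grids (and tools like the dome grid test mentioned above) matter.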

Mapping six projectors using MadMapper.

One of the more intriguing aspects of MadMapper is its support for Syphon inputs. Syphon is an open source framework that allows macOS applications to share OpenGL frames and textures with one another. This makes it possible to route, for example, video frames rendered on the GPU to another application with almost no additional overhead. As a quick test, we opened Max/MSP/Jitter and used it to play a 2k dome master. Using Syphon, we routed the video frames to MadMapper and were able to play a 2k video encoded with the QuickTime Animation codec at 15 fps! This is a very promising first attempt.
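To get a feel for why sharing GPU textures matters, consider the raw bandwidth an uncompressed 2k stream would need if every frame were copied between applications instead. A back-of-the-envelope sketch (assuming a 2048×2048 frame at 4 bytes per pixel; the post doesn’t state the exact frame dimensions):

```python
def frame_bandwidth_mb_s(width, height, bytes_per_pixel, fps):
    """Raw bandwidth needed to move uncompressed frames, in MB/s."""
    return width * height * bytes_per_pixel * fps / 1e6

# A 2k (2048×2048) RGBA stream at 15 fps:
print(frame_bandwidth_mb_s(2048, 2048, 4, 15))  # ~252 MB/s
```

Roughly a quarter gigabyte per second if copied frame by frame; keeping the frames on the GPU and handing a texture reference between applications sidesteps that cost entirely.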

2 comments on “Fulldome Development for Interactive Immersive Training : Entry 1”

    • We just happened to test a video that used the Animation codec. Since we were able to achieve a decent frame rate with the Animation codec, we know it will work. We will definitely test a video with ProRes 422 once we lock down mapping a grid.
      Thanks for the comment!
