We are very excited to announce that with the vDome software, we have achieved real-time interactivity! Now we can VJ in the dome, hook up external sensors so the audience can generate interactive content such as paintings driven by their movement, and we can move an avatar within a 3D environment. We are developing a public show that showcases all of this and more, to take place during the ISEA2012 (International Symposium on Electronic Art) conference held in Albuquerque and Santa Fe in late September.
vDome was built in Max/MSP/Jitter, which allows us to use Syphon to connect various software applications. We have now successfully connected Modul8 and run VJ tests on the dome. VJ Jane DaPain was with us a couple of weeks ago and not only had Modul8 working in real time on our six-projector 2K dome, she ran it all from her iPad using TouchOSC – Amazing! This opens up a whole world of possibilities!
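Under the hood, TouchOSC controls Modul8 by sending OSC messages over UDP. As a rough illustration of what travels on the wire, here is a minimal sketch of encoding and decoding a simple OSC message using only the Python standard library; the address `/1/fader1` and the float value are hypothetical examples, not addresses from our actual setup.

```python
import struct

def osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded out to a 4-byte boundary
    return b + b"\x00" * (4 - len(b) % 4)

def build_osc_message(address: str, *floats: float) -> bytes:
    """Encode an OSC message with float32 arguments (big-endian, per the spec)."""
    msg = osc_pad(address.encode("ascii"))
    msg += osc_pad(("," + "f" * len(floats)).encode("ascii"))  # type tag string
    for f in floats:
        msg += struct.pack(">f", f)
    return msg

def parse_osc_message(data: bytes):
    """Decode the address, type tags, and float arguments of a simple OSC message."""
    end = data.index(b"\x00")
    address = data[:end].decode("ascii")
    offset = (end // 4 + 1) * 4                 # skip padding after the address
    tag_end = data.index(b"\x00", offset)
    tags = data[offset + 1:tag_end].decode("ascii")  # drop the leading ","
    offset = (tag_end // 4 + 1) * 4             # skip padding after the tags
    args = []
    for t in tags:
        if t == "f":
            args.append(struct.unpack(">f", data[offset:offset + 4])[0])
            offset += 4
    return address, args
```

Round-tripping a hypothetical fader message, `parse_osc_message(build_osc_message("/1/fader1", 0.5))`, returns the original address and value; in practice a library (or Max's `udpreceive`) does this decoding for you.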
We also ran tests using canned Max/MSP examples, which can be viewed in the videos below. (Please note that we have not run an alignment calibration recently, so some of the alignment is off. We have been moving the dome around a lot lately.)
In this video, we show that we are able to move a particle generator around the dome without lag.
In this video, we show that we are able to move an avatar in 3D space without lag.
In this video, we show that we are able to paint on the dome without lag.
All of these tests were run from a single 12-core Mac Pro with two Quadro 4000 cards, two TripleHead2Go units, and a mouse to control the movement. The next steps as we move forward:
- Test interactivity using external sensors such as Kinect, webcams, IR, and Wii controllers
- Invite VJs and DJs into the dome to showcase its awesomeness
- Create an interactive art piece to showcase during ISEA2012
- Work with artists to create in the dome using the interactivity
- Connect to our Organic Motion markerless motion capture system to create real-time avatar performance in the dome
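The sensor work in the steps above ultimately comes down to mapping a tracked 3D position (say, a Kinect skeleton joint) onto the dome surface. As a minimal sketch, assuming the sensor reports positions relative to the dome center and a domemaster-style fisheye layout (zenith at the image center, horizon at the edge), the conversion might look like this; the function name and coordinate conventions are our illustrative assumptions, not part of vDome:

```python
import math

def direction_to_dome_uv(x: float, y: float, z: float):
    """Map a 3D direction from the dome center to normalized fisheye
    (domemaster-style) texture coordinates. Conventions here are illustrative:
    z points up, and the horizon lands on the edge of the image circle."""
    azimuth = math.atan2(y, x)                    # angle around the horizon
    elevation = math.atan2(z, math.hypot(x, y))   # angle up from the horizon
    r = (math.pi / 2 - elevation) / (math.pi / 2) # 0 at zenith, 1 at horizon
    u = 0.5 + 0.5 * r * math.cos(azimuth)
    v = 0.5 + 0.5 * r * math.sin(azimuth)
    return u, v
```

For example, a point straight overhead maps to the image center (0.5, 0.5), while a point on the horizon maps to the edge of the image circle in the direction of its azimuth.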