A few weeks back I joined a team to work on an art project for Burning Man that uses Kinect, Ableton Live, and Quartz Composer to create a synaesthetic experience. You can follow the progress of that project at http://projectsynapse.tumblr.com.
To enable the other members of my team to create this experience, I made an application that sends joint and event data via OSC, a set of Max for Live patches that use those messages to control Ableton, and a Quartz Composer plugin that reads in the depth buffer image.
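For readers curious what an OSC joint message looks like on the wire, here is a minimal sketch of hand-encoding one in Python using only the standard library. The address path (`/joint/right_hand`) and the x/y/z float layout are hypothetical illustrations, not the actual message format my application emits:

```python
import struct

def osc_pad(s: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    return s + b"\x00" * (4 - len(s) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    # Build a minimal OSC message: padded address, padded type-tag
    # string (',' followed by one 'f' per argument), then big-endian
    # float32 arguments.
    tags = "," + "f" * len(floats)
    packet = osc_pad(address.encode()) + osc_pad(tags.encode())
    for f in floats:
        packet += struct.pack(">f", f)
    return packet

# Hypothetical layout: one message per joint carrying x, y, z.
msg = osc_message("/joint/right_hand", 0.42, 1.0, -0.3)
# The packet would then be sent over UDP, e.g.:
#   socket.sendto(msg, ("127.0.0.1", 9000))
```

In practice you would use an OSC library rather than packing bytes by hand, but seeing the raw layout makes it easier to debug messages with a network sniffer when a patch isn't responding.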
I realized that these tools were very general in application and easy enough to use that they would be of benefit to others as well, so I’m distributing them here. My hope is that some talented people will get their hands on these tools and create some truly awesome performances with them.
Before emailing me about crashes, please make sure your Kinect is properly plugged in to both the computer and the wall.