I watched the recent Kinect SDK launch, which gave me some ideas about how to use the Kinect. One interesting point was that Kinect applications should be designed from scratch, rather than taking a previous application and bolting a Kinect interface onto it. This led me to the idea of using the Kinect more as a live virtual instrument, whereas before I was going to build an application similar to a digital audio workstation controlled with the Kinect (as the posted image shows).
Now I will concentrate on implementing different ways to interface with a live virtual instrument. I am going to set it up as a server/client application, so that potentially many clients could play an instrument together. Right now I have most of the server done. The server is designed to accept MIDI events from the clients, so it can pass these into the virtual instruments and play back the sound.
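To make the client/server idea concrete, the clients would send standard MIDI channel voice messages over the wire. A minimal sketch of encoding and decoding a note-on message (the helper names and the plain-bytes framing are my assumptions, not part of the project described above):

```python
def encode_note_on(channel, note, velocity):
    """Pack a MIDI note-on message into its 3-byte wire form.

    Note-on status byte is 0x90 ORed with the channel (0-15);
    note and velocity are 7-bit values (0-127).
    """
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])


def decode_message(msg):
    """Unpack a 3-byte MIDI channel message; returns a tuple the
    server could route to the right virtual instrument."""
    status, data1, data2 = msg
    if status & 0xF0 == 0x90:
        return ("note_on", status & 0x0F, data1, data2)
    if status & 0xF0 == 0x80:
        return ("note_off", status & 0x0F, data1, data2)
    return ("unknown", status, data1, data2)
```

A server receiving `encode_note_on(0, 60, 100)` from a client would decode it as a note-on for middle C at velocity 100 and forward it to the loaded instrument.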
For each VST instrument I am planning to have an associated XML document. This will be read when the instrument is loaded, and will contain the information needed to manipulate the instrument while it is being played. For example, when the right hand is moved up, it will change the filter cut-off frequency between two given values. I am planning on having different ways to interact with the instruments, so the XML file will specify which interaction methods can be used with which instruments and how they will work.
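One way this per-instrument XML could look, together with the code that maps a gesture reading onto a parameter range, is sketched below. The element and attribute names (`mapping`, `control`, `parameter`, `min`, `max`) are my own invention for illustration; the post does not specify a schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical per-instrument config: right-hand height drives the
# filter cut-off between 200 Hz and 8000 Hz.
CONFIG = """<instrument name="ExamplePad">
  <mapping control="right_hand_y" parameter="filter_cutoff" min="200" max="8000"/>
</instrument>"""


def load_mappings(xml_text):
    """Parse the instrument's XML into {control: (parameter, lo, hi)}."""
    root = ET.fromstring(xml_text)
    mappings = {}
    for m in root.findall("mapping"):
        mappings[m.get("control")] = (
            m.get("parameter"),
            float(m.get("min")),
            float(m.get("max")),
        )
    return mappings


def apply_control(mappings, control, position):
    """Map a normalised gesture position (0.0-1.0, e.g. hand height
    within the camera frame) linearly onto the parameter's range."""
    parameter, lo, hi = mappings[control]
    return parameter, lo + position * (hi - lo)
```

With this, a hand halfway up the frame would set the cut-off to the midpoint of the configured range; adding a new gesture to an instrument is just another `<mapping>` element rather than a code change.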