Over the past few weeks I have been slowly refactoring some of my Kinect controller code. At the start of each frame it checks the user's posture, and if the posture matches one of the programmed gestures we trigger a gesture start event. These gestures are also useful elsewhere in the Kinect work; for example, we do not want to play an instrument while both hands are down below the hips.
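As a rough illustration, the per-frame posture check could look something like the sketch below. The `Skeleton` and `Joint` types and the joint names here are assumptions for the example, not the real Kinect SDK API or the project's actual classes.

```java
// Minimal stand-ins for skeleton data (assumed shapes, not the real Kinect SDK).
final class Joint {
    final float x, y, z;
    Joint(float x, float y, float z) { this.x = x; this.y = y; this.z = z; }
}

final class Skeleton {
    final Joint leftHand, rightHand, hipCenter;
    Skeleton(Joint leftHand, Joint rightHand, Joint hipCenter) {
        this.leftHand = leftHand;
        this.rightHand = rightHand;
        this.hipCenter = hipCenter;
    }
}

final class GestureDetector {
    // "Hands down" posture: both hands below the hip line.
    // A gesture start event would fire when this first becomes true.
    static boolean handsBelowHips(Skeleton s) {
        return s.leftHand.y < s.hipCenter.y && s.rightHand.y < s.hipCenter.y;
    }
}
```

Checking a simple predicate like this once per frame keeps gesture detection cheap, and the result can gate the instrument code so nothing plays while the hands-down gesture is active.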
After this we pass the skeleton into the IKinectWork interface for processing, which lets me extend the system to many other applications. For now I have DrumWork and GuitarWork, each of which uses the Kinect in a different way.
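A sketch of what that interface might look like, assuming a single per-frame processing method (the method name and the placeholder `Skeleton` type here are my guesses, not the project's actual signatures):

```java
// Placeholder skeleton type for the sketch.
final class Skeleton {
    final float leftHandX, leftHandY, rightHandX, rightHandY;
    Skeleton(float lx, float ly, float rx, float ry) {
        leftHandX = lx; leftHandY = ly; rightHandX = rx; rightHandY = ry;
    }
}

// Each instrument implements the same interface, so the frame loop
// only needs to hold an IKinectWork and call process() on it.
interface IKinectWork {
    String process(Skeleton skeleton);
}

final class DrumWork implements IKinectWork {
    public String process(Skeleton skeleton) {
        // Real code would map hand positions to drum hit zones.
        return "drum";
    }
}

final class GuitarWork implements IKinectWork {
    public String process(Skeleton skeleton) {
        // Real code would map strumming motion to guitar sounds.
        return "guitar";
    }
}
```

The benefit of the interface is that the main loop stays unchanged when a new instrument is added; swapping `DrumWork` for `GuitarWork` is just assigning a different implementation.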
Right now I have completed work on DrumWork. The user can hit zones defined by the program to play the drums, and I want to show those zones on the screen somehow. I am also working on using hand speed to determine the volume of the drum sounds, and I am trying to find the best way to reuse code across instruments.
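One way the hit zones and speed-based volume could fit together is sketched below; the rectangular zones, the speed-to-volume mapping, and the calibration constant are all assumptions for illustration, not the project's actual implementation.

```java
// A rectangular hit zone in screen/skeleton space (assumed representation).
final class DrumZone {
    final float minX, maxX, minY, maxY;
    DrumZone(float minX, float maxX, float minY, float maxY) {
        this.minX = minX; this.maxX = maxX;
        this.minY = minY; this.maxY = maxY;
    }
    boolean contains(float x, float y) {
        return x >= minX && x <= maxX && y >= minY && y <= maxY;
    }
}

final class DrumVolume {
    // Map hand speed to a volume in [0, 1], saturating at an
    // assumed calibration constant for the fastest useful hit.
    static final float MAX_SPEED = 0.5f;

    static float fromSpeed(float speed) {
        return Math.min(1.0f, speed / MAX_SPEED);
    }
}
```

Keeping the zone test and the volume mapping as small standalone pieces like this is one route to the code reuse mentioned above: GuitarWork could reuse the same zone and speed helpers with different geometry and a different sound mapping.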