Here's what I did for my class project in Sensing and Planning in Robotics. This was done as part of a 5-member team. Gestures are captured with an Inertial Measurement Unit that I hacked together, and we use Hidden Markov Models to recognize them. The robots track the leader through a pair of Wiimotes on each robot that watch a sensor bar carried by the leader. This is one project that uses 5 different languages on the same framework. :P
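In case it helps make the HMM part concrete, here's a rough sketch of the recognition step in Python. This isn't our project code, and the model numbers are made up; it just shows the idea: quantize the IMU readings into discrete symbols, score the sequence against one trained HMM per gesture with the forward algorithm, and pick the gesture with the highest likelihood.

```python
# Minimal sketch of HMM-based gesture recognition (not the project code).
# Assumes each gesture already has a trained discrete-observation HMM
# (start probs, transition matrix, emission matrix) and that the IMU
# stream has been quantized into integer symbols.
import numpy as np

def log_forward(obs, log_pi, log_A, log_B):
    """Log-likelihood of an observation sequence under one HMM,
    computed with the forward algorithm in log space."""
    alpha = log_pi + log_B[:, obs[0]]  # initialization
    for o in obs[1:]:
        # alpha_j = logsumexp_i(alpha_i + log A_ij) + log B_j(o)
        alpha = np.logaddexp.reduce(alpha[:, None] + log_A, axis=0) + log_B[:, o]
    return np.logaddexp.reduce(alpha)

def classify(obs, models):
    """Return the gesture whose HMM assigns the sequence the highest likelihood."""
    scores = {name: log_forward(obs, *params) for name, params in models.items()}
    return max(scores, key=scores.get)

# Toy 2-state, 3-symbol models for two hypothetical gestures.
def toy_model(A, B, pi=(0.5, 0.5)):
    return (np.log(pi), np.log(A), np.log(B))

models = {
    "wave":  toy_model([[0.7, 0.3], [0.3, 0.7]], [[0.8, 0.1, 0.1], [0.1, 0.1, 0.8]]),
    "point": toy_model([[0.9, 0.1], [0.2, 0.8]], [[0.1, 0.8, 0.1], [0.1, 0.8, 0.1]]),
}

print(classify([0, 2, 0, 2, 0], models))  # alternating symbols look like "wave"
```

In the real system the models would be trained on recorded accelerometer sequences for each gesture; the classification step stays the same.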
Oh, and for those of you who might have questions, feel free to comment. :)