Title: Gesture-Controlled Musical Conducting
The idea is to use a Kinect and/or Leap Motion gestural input device to devise methods for musical conducting. This means recognising appropriate bodily gestures and mapping them to MIDI messages that can control a digital audio workstation. Machine learning methods will be used to recognise the gestures.
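The gesture-to-MIDI mapping described above can be sketched in a few lines. This is a minimal illustration only: the gesture names, controller numbers, and the idea of driving them from a normalised sensor value (e.g. hand height) are assumptions, not part of the project specification.

```python
# Illustrative sketch: turning a recognised gesture plus an intensity reading
# into a raw 3-byte MIDI Control Change message. Gesture names and CC
# assignments below are hypothetical examples, not the project's actual mapping.

def gesture_to_midi(gesture: str, intensity: float, channel: int = 0) -> bytes:
    """Map a recognised gesture to a MIDI Control Change message.

    intensity is assumed normalised to [0.0, 1.0], e.g. hand height
    reported by the Kinect or Leap Motion skeleton tracking.
    """
    cc_numbers = {
        "crescendo": 11,    # expression controller (standard MIDI CC 11)
        "tempo_beat": 16,   # general-purpose controller 1
    }
    controller = cc_numbers[gesture]
    value = max(0, min(127, round(intensity * 127)))  # clamp to 7-bit range
    status = 0xB0 | (channel & 0x0F)                  # Control Change status byte
    return bytes([status, controller, value])

msg = gesture_to_midi("crescendo", 0.5)
# 3 bytes: status 0xB0 (CC, channel 1), controller 11, value 64
```

A DAW listening on a virtual MIDI port would interpret such messages as automation input; the machine-learning component's job is to supply the `gesture` label and `intensity` from the raw sensor stream.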
Deliverables: Initial plan; Final report
Student: Kieran Flay
Supervisor: Dave Marshall
Moderator: Alia I Abdelmoty
Report: Archive